Megan Garber introduces readers to “Ellie,” a virtual therapist designed to “suss out symptoms of anxiety, depression, and – of particular interest to [project funders] DARPA – PTSD”:
[USC professor Louis-Philippe] Morency has been working with Ellie – part of the university’s SimSensei project – for several years now. In that time, he has helped to build a program capable of reading and responding to human emotion in real time. And capable, more to the point, of offering those responses via a human-like animation.
To build that system, the SimSensei team took a three-step approach: first, they analyzed actual humans interacting, to observe the linguistic and behavioral nuances of those conversations. From there, they created a kind of intermediate step they nicknamed the “Wizard of Oz.” This was “a virtual human,” as Morency describes it, “with a human behind it, pressing the buttons.” Once they had a framework for the rhythms of a face-to-face therapy session, they added facial-movement sensors and dialogue managers – creating a system, all in all, that can read and react to human emotion. And to all that they added animation modules, giving a body – well, a “body” – to their program. Ellie was born.
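The progression Garber describes, from a hidden human operator to an automated dialogue manager, maps naturally onto a swappable “policy” interface: the perception and animation layers stay fixed while the thing choosing Ellie’s next line changes. The Python sketch below is purely illustrative, and every name, signal, and rule in it is a hypothetical stand-in of mine, not SimSensei’s actual code.

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class PerceivedState:
    """Hypothetical nonverbal signals read from one window of video/audio."""
    smile_intensity: float  # 0.0-1.0, e.g. from facial-movement sensing
    gaze_aversion: float    # fraction of the window spent looking away
    speech_pause_s: float   # seconds since the participant last spoke


class DialoguePolicy(Protocol):
    """Anything that can choose Ellie's next line, human or automated."""
    def next_utterance(self, state: PerceivedState, transcript: list[str]) -> str:
        ...


class WizardOfOzPolicy:
    """Stage two: a hidden human operator 'presses the buttons.'"""
    def next_utterance(self, state: PerceivedState, transcript: list[str]) -> str:
        print(f"[wizard console] perceived state: {state}")
        return input("operator, type Ellie's reply: ")


class AutomatedPolicy:
    """Stage three: a toy rule-based dialogue manager replaces the operator."""
    def next_utterance(self, state: PerceivedState, transcript: list[str]) -> str:
        if state.speech_pause_s > 2.0:
            return "Take your time. I'm still listening."
        if state.gaze_aversion > 0.6:
            return "That sounds hard to talk about. Can you say more?"
        return "I see. How did that make you feel?"


def render(utterance: str, state: PerceivedState) -> None:
    """Placeholder for the animation module that gives Ellie her 'body'."""
    gesture = "leaning in" if state.gaze_aversion > 0.6 else "nodding"
    print(f"Ellie ({gesture}): {utterance}")


def run_turn(policy: DialoguePolicy, state: PerceivedState,
             transcript: list[str]) -> None:
    """One conversational turn: perceive -> decide -> animate."""
    utterance = policy.next_utterance(state, transcript)
    transcript.append(utterance)
    render(utterance, state)


if __name__ == "__main__":
    # Swapping in WizardOfOzPolicy() here is the only change needed
    # to fall back to the human-operated prototype.
    run_turn(AutomatedPolicy(), PerceivedState(0.1, 0.7, 0.4), [])
```

The point of the interface is methodological: the Wizard-of-Oz stage lets the team record what a competent human operator does given the same perceived signals, and those recordings become the specification the automated policy later has to match.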