Researchers at the University of Southern California’s Institute for Creative Technologies (ICT) are developing a virtual therapist that can identify signs of depression, anxiety, and post-traumatic stress disorder (PTSD). Bringing together machine learning, natural language processing, and computer vision technologies, the SimSensei project is aimed at helping military personnel and their families, while reducing the stigma often associated with seeking help.
The technology is actually made up of two systems: the virtual human "SimSensei," with which the person interacts, and the MultiSense system. As well as listening to what the patient says, the MultiSense system tracks and analyzes a range of signals they give off, from facial expressions and body posture to voice and language patterns. It can also detect whether the person is paying attention or fidgeting and getting distracted. This data is analyzed in real time to look for indicators of psychological distress and to inform SimSensei's responses.
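To make the idea of real-time multimodal analysis concrete, here is a minimal sketch in Python. It is purely illustrative and not based on ICT's actual implementation: the signal names, weights, and the simple weighted-average fusion are all invented assumptions standing in for whatever models the real system uses.

```python
# Hypothetical sketch of multimodal fusion, loosely inspired by the
# MultiSense description above. Feature names and weights are invented.
from dataclasses import dataclass

@dataclass
class FrameSignals:
    """Per-frame observations from hypothetical vision/audio trackers."""
    smile_intensity: float   # 0.0-1.0, from facial-expression analysis
    gaze_on_screen: float    # fraction of the frame spent looking at the screen
    fidget_level: float      # 0.0-1.0, from body-posture tracking
    voice_energy: float      # normalized vocal energy

def distress_indicator(frames: list[FrameSignals]) -> float:
    """Combine averaged signals into a single 0-1 indicator.

    Low smiling, averted gaze, high fidgeting, and a flat voice all push
    the score up. The weights here are illustrative, not clinical.
    """
    n = len(frames)
    avg = lambda attr: sum(getattr(f, attr) for f in frames) / n
    score = (0.30 * (1 - avg("smile_intensity"))
             + 0.25 * (1 - avg("gaze_on_screen"))
             + 0.25 * avg("fidget_level")
             + 0.20 * (1 - avg("voice_energy")))
    return max(0.0, min(1.0, score))
```

In a real pipeline these per-frame signals would come from dedicated trackers running on camera and microphone streams, and the fusion step would be learned rather than hand-weighted; the sketch only shows the overall shape of the idea.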
The researchers say the system is not aimed at providing an exact diagnosis. Instead, the objective is to provide what they call a “general metric” of psychological health that will help clinicians detect a potential stress disorder sooner. The system can also create a long-term profile of the patient, a kind of timeline of changes that helps identify problems before they progress too far.
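The long-term profile idea can be sketched as follows. Again, this is an invented illustration, not ICT's method: it simply stores a per-session score and flags a sustained rise, with the window size and threshold chosen arbitrarily.

```python
# Hypothetical sketch of a long-term profile: record one score per
# session and flag a sustained upward trend. All names are invented.
from datetime import date

def flag_worsening(timeline: list[tuple[date, float]],
                   window: int = 3,
                   threshold: float = 0.1) -> bool:
    """Return True if the mean score over the most recent `window`
    sessions exceeds the mean of all earlier sessions by more than
    `threshold`. Requires at least 2 * window sessions of history.
    """
    if len(timeline) < 2 * window:
        return False
    scores = [score for _, score in sorted(timeline)]
    recent = sum(scores[-window:]) / window
    earlier = sum(scores[:-window]) / (len(scores) - window)
    return recent - earlier > threshold
```

A clinician-facing tool would presumably use something more robust than a two-window mean comparison, but the sketch captures the point of the timeline: spotting a drift before it becomes a crisis.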
While the SimSensei and MultiSense systems are still at the prototype stage, ICT has already unleashed a few samples of virtual humanity into the world. Since 2009, visitors to the Boston Museum of Science (MOS) have been able to meet Ada and Grace, two virtual visitor guides who can also answer questions about the museum. Also at MOS, in Cahner’s ComputerPlace, Coach Mike can be found helping young visitors learn about robotics.
The video below shows a couple of people having a session with SimSensei.
Source: ICT