Have you ever visited a website and been invited to chat with their virtual assistant? It's becoming more and more commonplace for companies to use technology as a way to interact with customers. A friendly face (even though it's a cartoon) who never gets frustrated, never responds to sarcasm or swearing and only appears to want to help is considered a step up from a cold list of frequently asked questions.
Similarly, anyone familiar with an Xbox gaming console will probably also be aware of the webcam motion-sensing device known as Kinect. To date the device has largely been used for gesture recognition in order to support a variety of games and activities such as virtual tennis, skiing, fitness programs or other forms of interactivity. Here we can move alongside or even compete physically with the avatar on the screen, making for a more immersive and believable experience.
Clinical applications have been somewhat overlooked. Recently, however, the device has been tested with that well-known but somewhat hit-and-miss aspect of depression: body language.
Body posture, facial expression and body language in general speak volumes about the way a person is feeling. Dragging limbs, stooped shoulders, sighing, face rubbing, motionless expressions, dull eyes, pained expressions, lack of eye contact: the list goes on and on. With so many signs you'd think it would be virtually impossible to overlook the possibility of depression during a clinical consultation, yet it happens more often than we might like to think. Why? Well, there are various reasons. One is the fact that if a patient doesn't mention depression as the main symptom, it's possible the supporting symptoms (upset stomach, aches and pains, sleeplessness, etc.) take priority. A more astute doctor may wonder if these are signs of depression, but there is still something of a tendency to resort to a handful of short-answer questions in order to test the hypothesis. All the time the doctor is checking off the questionnaire, the subtle little changes in the patient's body language and facial expression may go unnoticed.
The SimSensei program, developed by researchers at the University of Southern California’s Institute for Creative Technologies, uses Kinect to detect and record various movements during a virtual consultation.
According to its developers “SimSensei is a virtual human platform specifically designed for healthcare support and is based on the 10+ years of expertise at ICT with virtual human research and development. The platform enables an engaging face-to-face interaction where a virtual person reacts to the perceived user state and intent, through its own speech and gestures.”
In practice the on-screen avatar introduces herself as Ellie, states she's not a therapist, and follows this up by saying she's interested in people and would love to know about you. The avatar's responses are interesting to watch as the program adapts and responds to the eye movement, verbal pauses, fidgeting and so on of a real-life 'patient'.
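To give a flavour of how a system might score something like fidgeting from sensor data, here is a toy sketch. This is not SimSensei's actual code; the function names, data layout and threshold are all invented for illustration. It simply sums frame-to-frame displacement of tracked joint positions (the kind of skeletal data a Kinect-style sensor can provide) and flags a window of frames as "restless" if the total movement exceeds a threshold.

```python
import math

def movement_energy(frames):
    """Sum of frame-to-frame joint displacements across a window of frames.

    Each frame is a dict mapping a joint name to an (x, y) position.
    """
    energy = 0.0
    for prev, curr in zip(frames, frames[1:]):
        for joint, (x1, y1) in prev.items():
            x2, y2 = curr[joint]
            energy += math.hypot(x2 - x1, y2 - y1)
    return energy

def is_fidgeting(frames, threshold=0.5):
    """Flag a window whose total movement exceeds an (invented) threshold."""
    return movement_energy(frames) > threshold

# A still subject vs. one making small, restless hand movements:
still = [{"hand": (0.0, 0.0)} for _ in range(5)]
restless = [{"hand": (0.0, 0.2 * i * (-1) ** i)} for i in range(5)]

print(is_fidgeting(still))     # False
print(is_fidgeting(restless))  # True
```

A real system would of course work with 3D joint streams, smooth out sensor noise, and combine many such cues (gaze, pauses, posture) rather than relying on a single crude score.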
Quite what refinements are required before such devices are incorporated into clinical consultations remains unknown, but the sharing of expertise from clinical psychology, cognitive science, speech processing and artificial intelligence makes for an interesting development and possibly another tool in the diagnostic bag.
Whether people will be comfortable exchanging intimate details with a virtual person is another thing entirely, but you can see the program in action on this YouTube video. Incidentally, for this demonstration actors play the part of interviewees/patients.
Published On: April 03, 2013