
Researchers develop interactive, emotion-detecting GPS robot

Researchers from the University of Cambridge have developed a computer system that's able to recognize and respond to the emotions and gestures of its human user, and outputs its own response via a specially-designed robot

While computer systems are now very capable of recognizing vocal input, they offer minimal interactive feedback. A team of Cambridge University researchers has now developed a system that can not only detect a user's emotional state, but can also make expressive responses of its own. Using a robotic likeness of the godfather of the programmable computer, Charles Babbage, the team has hooked the system up to a driving simulator and created a computerized driving companion and navigator that reacts to the driver in much the same way as a human passenger would.

When people talk to each other, we gauge much of how the other person feels from their facial expressions and from how things are said. Tone of voice and expression tell us how best to continue a conversation – whether to empathize with sadness, share happiness, or respond to anger or frustration. People even express themselves in this way when interacting with machines, but the devices don't seem capable of reacting to how a person feels.

Team leader Professor Peter Robinson challenged his team from the Rainbow Graphics and Interaction Research Group to build a system that could "understand not just what I'm saying, but how I'm saying it."

Helping computers to understand emotion

Over the years, computer interfaces have become quite proficient at taking raw input and transforming it into digital data, and voice recognition software is now a capable means of entering information into a computer system. Getting a computer to talk back as we would is another matter, so the researchers drew on the extensive knowledge of their colleagues at Cambridge University's Autism Research Centre, who study the difficulties that some people have in understanding emotions.

They developed a computer system that tracks feature points on a user's face via a camera and then compares the input with entries in a database of hundreds of predefined mental states, interpreting combinations of gestures as emotions. The system also compares the tempo, pitch and emphasis of the user's voice with the same database, and because body movement matters too, posture and gestures are interpreted against the same criteria.

The three measures are combined to form an overall picture of the user's emotional state, and the system is said to achieve an accuracy of around 70 per cent – about the same as most of us. Robinson, however, also wanted a system capable of expressing itself.
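The article doesn't detail how the team fuses those three channels, but a minimal sketch of the general idea – weighting per-channel confidence scores and picking the strongest overall match – might look like the following. The labels, weights and function names are hypothetical stand-ins, not the researchers' actual implementation.

# Hypothetical sketch of multimodal emotion fusion: combine per-channel
# scores (face, voice, posture) into one estimate. The channels, labels
# and weights are illustrative assumptions, not Cambridge's actual system.

MENTAL_STATES = ["happy", "sad", "angry", "frustrated", "neutral"]

def fuse_channels(face_scores, voice_scores, posture_scores,
                  weights=(0.5, 0.3, 0.2)):
    """Weighted late fusion of per-channel confidence scores.

    Each *_scores argument maps a mental state to a confidence in [0, 1].
    Returns the state with the highest combined score.
    """
    combined = {}
    for state in MENTAL_STATES:
        combined[state] = (
            weights[0] * face_scores.get(state, 0.0)
            + weights[1] * voice_scores.get(state, 0.0)
            + weights[2] * posture_scores.get(state, 0.0)
        )
    return max(combined, key=combined.get)

# Example: face and voice point toward frustration, posture is ambiguous.
face = {"frustrated": 0.7, "neutral": 0.3}
voice = {"frustrated": 0.6, "angry": 0.4}
posture = {"neutral": 0.5, "frustrated": 0.5}
print(fuse_channels(face, voice, posture))  # -> "frustrated"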

Making the jump off the screen

The first step in the process was to record some common facial expressions and then use software to generate an expressive digital character on a computer screen. To bring the setup off the screen and into our world, the team used the robotic head and shoulders of a chimp (called Virgil) and then made an evolutionary jump to use a robotic bust of Elvis Presley.

Finally, Charles was created and the whole shebang was hooked up to a driving simulator. The robot has a camera in each eye to track and register the expressions and gestures of the user, while two dozen motors control its facial expressions, adding a frankly spooky level of realism to the proceedings. Charles acts somewhat like a GPS navigator, but one that responds to user input on the fly, and it can also make decisions based on the emotional state of the driver – for instance, the system could limit distractions from the radio or navigation system during periods of stress.
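To make that last idea concrete, a navigator could gate non-essential prompts on an estimated stress level. The short sketch below is a hypothetical illustration of that behaviour – the threshold, prompt categories and function names are assumptions, not a description of Charles's actual software.

# Hypothetical sketch: suppress non-essential prompts when the driver's
# estimated stress is high. Threshold and prompt categories are assumed.

STRESS_THRESHOLD = 0.7  # assumed cut-off on a 0-1 stress estimate

def filter_prompts(prompts, stress_level):
    """Return only the prompts worth delivering at this stress level.

    Each prompt is a (text, essential) pair; essential prompts (e.g.
    turn-by-turn directions) always get through, while optional ones
    (radio suggestions, points of interest) are held back under stress.
    """
    if stress_level < STRESS_THRESHOLD:
        return [text for text, _essential in prompts]
    return [text for text, essential in prompts if essential]

prompts = [
    ("In 300 yards, turn left", True),
    ("New album available on the radio", False),
    ("Coffee shop ahead on the right", False),
]
print(filter_prompts(prompts, stress_level=0.85))
# -> ['In 300 yards, turn left']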

Robinson sees Charles as being a hint of a possible future in human-computer interaction. Let's hope that future incarnations are a little easier on the eye...

Cambridge Ideas - The Emotional Computer

3 comments
Mr Stiffy
Jeezers Krist... this is creepy - this is like living in a perpetual state of molestation.
Everything is just AT you, or you have to endure forcible interaction... from robot answering machines at the bank to ...... and this....
Time to go mad and declare this to be an effigy of satan and burn it.
windykites
In the picture, which one is the robot? Surely the robot should look like your wife?
windykites
I've just watched the video after my last posting. What really made me laugh was the expression on the robot's face when Robinson said, "I think this could be the beginning of a beautiful relationship". Actually, I was quite impressed with the robot's movements, but the wires at the back of the head reminded me of those poor little monkeys with the brain implants. If Charles used the same information as the Sat Nav, he would have repeated the same message in the same calm voice: continue for 300 yards and turn left; continue for 300 yards and turn left; continue...... Thump!