
Interactive Urban Robot needs you to give it directions

IURO, the Interactive Urban Robot, will ask people for directions in an unfamiliar world (Photo: Bartlomiej Stanczyk)
IURO's face communicates various expressions with 21 servo motors (Photo: Bartlomiej Stanczyk)
A look inside IURO's face (Photo: Bartlomiej Stanczyk)
Would you give this robot directions if it approached you on the street? (Photo: Bartlomiej Stanczyk)

The IURO (short for Interactive Urban Robot) is a new humanoid service robot built by Accrea Engineering, a spin-off of the Technical University of Munich (TUM). Researchers at TUM, ETH Zurich, and the University of Salzburg are collaborating on an EU-funded research project that seeks to teach robots to ask for, and follow, directions the human way.

IURO has undergone some changes since we saw it late last year. It now possesses an expressive head with moving eyes, eyelids, eyebrows, lips, lower jaw, and even ears, actuated by a total of 21 servo motors. The goal is to make it approachable and understandable to the general public, and having an expressive face helps.

It recognizes people and its surroundings with stereo cameras (located in the holes in its forehead) and gets better depth perception from a Kinect sensor. Its arms are just for show, but it has a built-in touchscreen interface, and it moves on wheels, using laser range finders to avoid bumping into obstacles.

What makes the project unique is that its makers are sending IURO out into the real world without it knowing where to go. The idea is to have the robot approach people on the street and ask them for directions to a specific landmark. Then, using what it is told, it must reach its destination. To do so, it must be able to visually recognize and track people while interacting with them through speech synthesis and recognition. If a person gives it directions (which will likely be somewhat vague by robot standards), IURO will have to translate their words and gestures into navigational plans. And that's a tall order for just about any robot.
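
To give a concrete sense of that translation step, here is a minimal, purely illustrative Python sketch of how spoken-style directions might be mapped to a coarse sequence of navigation steps. It is not based on the IURO project's actual software; the keyword grammar, the Step class, and the parse_directions function are all invented for illustration, and a real system would start from speech-recognition output, fold in pointing gestures, and reason probabilistically about ambiguity.

# Hypothetical sketch of the "words to navigational plan" step described
# above. This is NOT the IURO project's actual pipeline; the grammar and
# all names here are invented for illustration only.
import re
from dataclasses import dataclass

@dataclass
class Step:
    action: str   # e.g. "forward", "turn left", "destination on right"
    source: str   # the clause of the utterance this step came from

# Tiny keyword grammar; order matters so "second left" wins over "left".
PATTERNS = [
    (r"\b(go|walk|head) straight\b",     "forward"),
    (r"\b(second|2nd) left\b",           "skip one street, then turn left"),
    (r"\b(second|2nd) right\b",          "skip one street, then turn right"),
    (r"\bturn left\b|\btake a left\b",   "turn left"),
    (r"\bturn right\b|\btake a right\b", "turn right"),
    (r"\bon your left\b",                "destination on left"),
    (r"\bon your right\b",               "destination on right"),
]

def parse_directions(utterance: str) -> list[Step]:
    """Map a spoken-style direction string to an ordered list of steps."""
    steps = []
    # Split on commas and "then" so steps keep their spoken order.
    for clause in re.split(r",|\bthen\b", utterance.lower()):
        for pattern, action in PATTERNS:
            if re.search(pattern, clause):
                steps.append(Step(action, clause.strip()))
                break
    return steps

if __name__ == "__main__":
    plan = parse_directions(
        "Go straight, then take the second left, the church is on your right"
    )
    for step in plan:
        print(step.action, "<-", repr(step.source))

Run on the example utterance, this prints three ordered steps (go forward, skip one street then turn left, destination on right), each paired with the clause it came from. Turning such a symbolic plan into actual motion, while coping with vague or contradictory human phrasing, is the hard part the researchers are tackling.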

Below, you can watch a brief overview of the project in an interview recorded last week at the International Conference on Intelligent Robots and Systems (IROS) 2012.

Source: IURO Project and Accrea Engineering via IEEE Spectrum

IURO Robot at IROS 2012

1 comment
Joel Detrow
Is it too much to ask for the guys who build these things to just put a mask over the face if they can't get it to not look creepy?
Seriously, the problem with all these is that they try too hard to make it look human, or at the very least, they try to do it mechanically. It's too much too soon.
Anybody seen Wall-E? We had no trouble figuring out Wall-E's or Eve's "emotions" as it were, yet all they had was their eyes and body language. When this machine is sitting still, it's creepy enough, but then it moves and I get the urge to OH GOD KILL IT WITH FIRE
A mouth is not necessary for a robot to express moods, because we don't pay attention as much to the mouth as we do the eyes - placement, angle, eyelid openings, traveling gaze, etc.
I bet if the face were like that of Eve from Wall-E (virtual and stylized rather than cripplingly humanoid), it would come across as far less creepy, and actually be able to express a wider range of emotions.
All that said, I admire the work these guys are doing, trying to get a robot to interact with humans, interpret what they're saying, and get useful information from that. It's a monumental challenge, but I'm sure that with enough trial and error, they'll get it done. We already have Watson, so I'm sure interaction with human beings is very close.