Brown University develops autonomous, gesture-following robot
By Kyle Sherer
March 20, 2009
A team from Brown University has developed a robot capable of following verbal and nonverbal commands in indoor and outdoor environments. Built on a platform from iRobot, the company behind the PackBot, the machine was presented at the Human-Robot Interaction conference, held March 11-13.
"We have created a novel system where the robot will follow you at a precise distance, where you don't need to wear special clothing, you don't need to be in a special environment, and you don't need to look backward to track it," said Chad Jenkins, assistant professor of computer science at Brown University and the team's leader.
A video presentation showed the robot following a person through corridors and an outdoor parking lot while maintaining a three-foot distance. The robot responded to hand signals directing it to follow, stop, and breach doors, and correctly acted out combinations of signals, including an order to go through a door, stop, turn around, and return to its starting point.
The PackBot is an ordnance disposal robot used in Iraq and Afghanistan. Unlike its predecessor, however, the new Brown University robot does not require a remote control. It is equipped with a visual sensor that allows it to distinguish human silhouettes, and a CSEM Swiss Ranger infrared sensor that allows it to gauge depth. Together, the sensors give the robot the means to interpret and carry out instructions from a nearby person.
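To make the follow-at-a-fixed-distance behavior concrete, here is a minimal sketch of a proportional control loop driven by depth readings. This is a hypothetical illustration, not Brown's actual software; the function name, gains, and thresholds are assumptions chosen for clarity.

```python
# Hypothetical sketch of a fixed-distance person-following loop, in the
# spirit of the system described above. The gain and thresholds are
# illustrative assumptions, not values from the Brown team's robot.

TARGET_DISTANCE_M = 0.91  # roughly the three feet described in the demo
GAIN = 0.8                # proportional gain mapping distance error to speed
DEADBAND_M = 0.05         # ignore tiny errors to avoid jittery motion

def follow_step(person_depth_m):
    """Return a forward velocity command (m/s) from one depth reading.

    person_depth_m is the measured distance to the tracked silhouette,
    e.g. from an infrared depth sensor such as the Swiss Ranger.
    """
    error = person_depth_m - TARGET_DISTANCE_M
    if abs(error) < DEADBAND_M:
        return 0.0  # close enough: hold position
    # Drive forward when the person is too far ahead, back up when too close.
    return GAIN * error
```

Called once per sensor frame, a loop like this drives forward when the tracked person pulls ahead and reverses when they step back, which is why the operator never needs to look backward to keep the robot in tow.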
"What you really want is a robot that can act like a partner," Jenkins said. "You don't want to puppeteer the robot. You just want to supervise it, where you say, 'Here's your job. Now, go do it.'"