
Mind-controlled robot avatars inch towards reality

By Jason Falconer

November 13, 2012

A researcher minds the robot's balance as it is commanded to pick up a canned drink by an operator (off camera)


Researchers at the CNRS-AIST Joint Robotics Laboratory (a collaboration between France's Centre National de la Recherche Scientifique and Japan's National Institute of Advanced Industrial Science and Technology) are developing software that allows a person to drive a robot with their thoughts alone. The technology could one day give a paralyzed patient greater autonomy through a robotic agent or avatar.

The system requires that a patient concentrate their attention on a symbol displayed on a computer screen (such as a flashing arrow). An electroencephalography (EEG) cap outfitted with electrodes reads the electrical activity in their brain, which is interpreted by a signal processor. Finally, the desired command is sent to the robot to carry out.
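The attention-on-a-flashing-symbol approach described above is characteristic of SSVEP-style brain-computer interfaces, in which each symbol flickers at a distinct frequency and the EEG shows elevated power at whichever frequency the user is watching. As a rough illustration of the signal-processing step, here is a minimal sketch; the frequencies, command names, and decoding method are illustrative assumptions, not the lab's actual implementation.

```python
import math

SAMPLE_RATE = 256  # Hz, a typical EEG sampling rate (assumption)
# Hypothetical mapping from flicker frequency to robot command:
COMMANDS = {12.0: "turn left", 15.0: "walk forward", 20.0: "turn right"}

def band_power(samples, freq, rate=SAMPLE_RATE):
    """Power of the signal at `freq`, via correlation with sine/cosine."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * i / rate) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * i / rate) for i, s in enumerate(samples))
    return (re * re + im * im) / n

def decode_command(samples):
    """Pick the candidate flicker frequency with the most power."""
    best = max(COMMANDS, key=lambda f: band_power(samples, f))
    return COMMANDS[best]

# Simulated one-second EEG window dominated by the 15 Hz flicker:
t = [i / SAMPLE_RATE for i in range(SAMPLE_RATE)]
eeg = [math.sin(2 * math.pi * 15.0 * x) + 0.3 * math.sin(2 * math.pi * 50.0 * x)
       for x in t]
print(decode_command(eeg))  # walk forward
```

A real system would add filtering, artifact rejection, and a confidence threshold before sending anything to the robot, but the core idea is the same: the decoded frequency, not any motor signal, selects the command.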

The system does not provide direct fine-grained motor control: the robot simply performs a preset action such as walking forward, turning right or left, and so on. The robot's artificial intelligence, developed over several years at the lab, allows it to perform more delicate tasks, such as picking up an object from a table, without needing human input. In this scenario, the robot's camera images are parsed by object recognition software, allowing the patient to choose one of the objects on a table by focusing their attention on it.
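The object-selection scenario can be sketched the same way: each object the recognition software highlights gets its own flicker overlay, so attending to an object is equivalent to attending to a frequency. The object labels and frequencies below are illustrative assumptions.

```python
# Hypothetical glue between object recognition and SSVEP selection.
FLICKER_FREQS = [12.0, 15.0, 20.0]  # Hz, one per highlighted object

def tag_objects(detected_labels):
    """Pair each detected object with a distinct overlay flicker frequency."""
    return dict(zip(FLICKER_FREQS, detected_labels))

def select_object(tags, attended_freq):
    """Return the object whose overlay frequency the user attended to."""
    return tags[attended_freq]

tags = tag_objects(["bottled water", "canned drink"])
print(select_object(tags, 15.0))  # canned drink
```

The robot's pre-programmed grasping behavior then takes over for the selected object, which is why no fine motor control is needed from the user.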

Object recognition software automatically detects and highlights the bottled water and canned drink in the robot's camera images, and by focusing on one of them the patient can command the robot to retrieve it

With training, the user can direct the robot's movements and have it pick up beverages or other objects in their surroundings. The system can be seen in use in DigInfo's video report (see source below).

The system is similar to, but more sophisticated than, earlier projects: one involving Honda's ASIMO robot in 2006, and another at the University of Washington in 2007.

A different and more direct approach is to track a patient's eye movements. Recent research conducted at the Université Pierre et Marie Curie-Paris enabled cursive writing on a computer screen through eye movement alone. The same technology could allow a patient to move a cursor and select from a multitude of action icons without going through the EEG middleman. The hitch is that, in some circumstances, eye movement isn't possible or can't be tracked reliably due to eye conditions. In those cases, brain implants may be the way to go.

No matter how you slice it, researchers aren't giving up, and with further progress robot avatars may cease being the stuff of science fiction. No doubt patients would feel empowered and liberated by this technology, but it will be a while before it can be implemented, and the robots deployed will likely look more like Toyota's recently unveiled Human Support Robot than advanced bipedal robots.

Source: AIST-CNRS JRL (Japanese) via DigInfo News

About the Author
Jason Falconer is a freelance writer based in central Canada with a background in computer graphics. He has written about hundreds of humanoid robots on his website Plastic Pals and is an avid gamer with an unsightly collection of retro consoles, cartridges, and controllers.