
Brain implant lets monkeys control virtual hand and feel virtual objects

By Darren Quick

October 6, 2011

The virtual arm controlled by a monkey selects an object based on its virtual texture

In a development that could have huge implications for quadriplegics, paraplegics and those with prosthetic limbs, researchers from Duke University and the École Polytechnique Fédérale de Lausanne (EPFL) have developed technology that has allowed monkeys to control a virtual arm and touch and feel virtual objects using only their brain activity. The researchers say it is the first-ever demonstration of a two-way interaction between a primate brain and a virtual body, and it could lead to robotic exoskeletons that not only allow paralyzed patients to walk again, but also let them feel the ground beneath them.

The researchers inserted electrodes in the motor cortex, the region of the brain involved in the planning, control and execution of voluntary movement, and in the somatosensory cortex, the area that processes input from cells in the body that are sensitive to sensory experiences, including touch. With these implants in place, they were able to train two monkeys to use their electrical brain activity to direct the virtual hands of an avatar to the surface of visually identical virtual objects and, upon contact, differentiate them based on their textures.

"This is the first demonstration of a brain-machine-brain interface (BMBI) that establishes a direct, bidirectional link between a brain and a virtual body," said study leader Miguel Nicolelis, MD, PhD, professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering. "In this BMBI, the virtual body is controlled directly by the animal's brain activity, while its virtual hand generates tactile feedback information that is signaled via direct electrical microstimulation of another region of the animal's cortex."

The researchers say that during the tests, the combined electrical activity of populations of 50 to 200 neurons in the monkeys' motor cortex controlled the steering of the virtual arm, while thousands of neurons in the primary tactile cortex simultaneously received continuous electrical feedback from the virtual hand's palm, allowing the monkeys to discriminate between objects based purely on their texture.
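To make the decoding step concrete, here is a minimal sketch in Python of the general technique described: mapping the firing rates of a recorded neural population to a velocity command for a virtual arm through a linear decoder. The weight matrix, array shapes and simulated spike counts below are illustrative assumptions, not the study's actual decoder.

import numpy as np

rng = np.random.default_rng(0)

n_neurons = 100  # the study recorded populations of 50 to 200 motor cortex neurons
# Decoding weights; in a real BMI these would be fit to recorded data, not random.
W = rng.normal(size=(2, n_neurons)) * 0.01

def decode_velocity(firing_rates):
    # Map a vector of per-neuron firing rates (spikes/s) to a 2-D (vx, vy) velocity.
    return W @ firing_rates

# Simulated closed loop: integrate the decoded velocity into an arm position.
position = np.zeros(2)
for _ in range(100):
    rates = rng.poisson(lam=10.0, size=n_neurons).astype(float)  # stand-in for recorded spikes
    position += decode_velocity(rates) * 0.01  # 10 ms time step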

The texture of the virtual objects was expressed as a pattern of minute electrical signals transmitted to the monkeys' brains, with a distinct electrical pattern corresponding to each of the three object textures. One monkey took nine attempts to learn how to select the correct object during each trial, while the other took just four. The researchers conducted several tests to confirm that the monkeys were actually sensing the objects and not just selecting them at random.
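As a rough illustration of that encoding scheme, the sketch below maps each texture to a distinct pulse train, a hedged Python approximation of "one electrical pattern per texture." The texture names, pulse frequencies and burst durations are invented for illustration; the paper's actual stimulation parameters are not reproduced here.

# Hypothetical mapping from virtual textures to microstimulation pulse trains.
TEXTURE_PATTERNS = {
    "coarse": {"pulse_hz": 400.0, "burst_ms": 100.0},
    "fine":   {"pulse_hz": 100.0, "burst_ms": 50.0},
    "none":   {"pulse_hz": 0.0,   "burst_ms": 0.0},  # object with no texture signal
}

def pulse_times_ms(texture, contact_ms):
    # Pulse onset times (ms) delivered while the virtual hand touches the object.
    p = TEXTURE_PATTERNS[texture]
    if p["pulse_hz"] == 0.0:
        return []
    period_ms = 1000.0 / p["pulse_hz"]
    times, t = [], 0.0
    while t < min(contact_ms, p["burst_ms"]):
        times.append(t)
        t += period_ms
    return times

For example, pulse_times_ms("coarse", 80.0) yields a pulse every 2.5 ms for the first 80 ms of contact, while "fine" produces a sparser 10 ms spacing, giving the brain two readily distinguishable patterns.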

"Such an interaction between the brain and a virtual avatar was totally independent of the animal's real body, because the animals did not move their real arms and hands, nor did they use their real skin to touch the objects and identify their texture," added Nicolelis. "It's almost like creating a new sensory channel through which the brain can resume processing information that cannot reach it anymore through the real body and peripheral nerves."

Nicolelis says the remarkable success seen with non-human primates suggests that humans could accomplish the same task much more easily, which means it should be possible to create a robotic exoskeleton that could allow severely paralyzed patients to walk while receiving tactile feedback. Such an exoskeleton would be directly controlled by the wearer's brain activity, allowing them to move autonomously. Meanwhile, sensors distributed across the exoskeleton would provide tactile feedback to allow the patient's brain to identify the texture, shape and temperature of objects.
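A speculative sketch of that closed loop, in which every class and method name is hypothetical rather than drawn from the study, might look like the following Python outline:

# Hypothetical outline of the brain-machine-brain loop described above.
# None of these component names come from the study; they only illustrate the flow.
class BrainMachineBrainLoop:
    def __init__(self, recorder, decoder, exoskeleton, stimulator):
        self.recorder = recorder        # reads population activity from motor cortex
        self.decoder = decoder          # turns that activity into motor commands
        self.exoskeleton = exoskeleton  # actuators plus distributed touch sensors
        self.stimulator = stimulator    # writes feedback into somatosensory cortex

    def step(self):
        activity = self.recorder.read()
        command = self.decoder.decode(activity)
        self.exoskeleton.apply(command)
        touch = self.exoskeleton.read_sensors()    # texture, shape, temperature
        self.stimulator.encode_and_deliver(touch)  # close the loop back to the brain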

The researchers recently set themselves the goal of carrying out the first public display of such an autonomous exoskeleton during the opening game of the 2014 FIFA Soccer World Cup to be held in Brazil.

The Duke University and École Polytechnique Fédérale de Lausanne (EPFL) team's study was published in the journal Nature on October 5, 2011.

Here's some video showing the virtual arm controlled by a monkey selecting objects based on their virtual texture.

About the Author
Darren Quick
Darren's love of technology started in primary school with a Nintendo Game & Watch Donkey Kong (still functioning) and a Commodore VIC 20 computer (not still functioning). In high school he upgraded to a 286 PC, and he's been following Moore's law ever since. This love of technology continued through a number of university courses and crappy jobs until 2008, when his interests found a home at Gizmag.
8 Comments

Great News.

With the exception of amputees, I don't think the exoskeleton is really necessary, even if sensor-equipped clothing is.

Slowburn
7th October, 2011 @ 12:21 am PDT

Interesting how they don't show us the monkeys with wires coming out of their brains. I don't know about this research from an ethical standpoint. :(

Leanne Franson
7th October, 2011 @ 03:11 am PDT

Deus Ex-style human augmentation is coming :) and it's welcome

Manuel Borbely
7th October, 2011 @ 03:46 am PDT

re: Leanne Franson

Given the number of morons who will take seeing wires coming out of the animal's head as proof of mistreatment, it is no surprise that they don't show it. If you are worried about the ethics of these tests, don't use any of the medical advancements of the last century.

Even if the research lab were run by a sadist, the animals would not be mistreated, because the stress would skew the results. The fact that they are achieving technical breakthroughs at this level is evidence of the care given to the animals.

Slowburn
7th October, 2011 @ 07:39 am PDT

Taking this a step further... why can't I have 3 or 4 hands/arms?

aschmitt
7th October, 2011 @ 11:59 am PDT

This is similar to experiments back in the '70s where electrodes were placed on the fingers, toes and tongue of human subjects and the corresponding areas of brain activity were tracked. The subjects were able to control a computer by thinking about moving the part of the body that the electrode was attached to. The signal generated by the brain was sent to control a keyboard on a screen, the keyboard/computer controlled a robot, and the robot brought the paralyzed person medical assistance in some form.

"Thinking" robots into action has been in R&D for at least 50 years now. Funny that industry and the military have not furthered these experiments for their use in manufacturing and war.

electric38
7th October, 2011 @ 01:06 pm PDT

I wonder if something like this could help victims of a massive stroke or heart attack who have lost some or all motor function in certain limbs?

Adam Ackels
8th October, 2011 @ 07:18 am PDT

There's a much lower-tech version of the same idea, with no implants. Pressure and temperature pads (e.g., on the fingers or feet of paralyzed people, or on amputees' prosthetics) are linked to lightweight variable buzzers arranged in a pattern on patches of "live" skin. By watching as various hot/warm/cool/cold and hard/medium/soft objects touch the pad areas, the brain quickly learns to "feel" them. The sensation is the same as from live flesh.

A side benefit is that it stops "phantom pain" from amputations and paralyzed appendages; having something real to work with stops the relevant brain areas from reacting to random twitchy signals from neighboring tissue.

Brian Hall
9th October, 2011 @ 11:16 am PDT