In a development that could have huge implications for quadriplegics, paraplegics and those with prosthetic limbs, researchers from Duke University and the École Polytechnique Fédérale de Lausanne (EPFL) have developed technology that has allowed monkeys to control a virtual arm and touch and feel virtual objects using only their brain activity. The researchers say it is the first-ever demonstration of a two-way interaction between a primate brain and a virtual body, and it could lead to robotic exoskeletons that not only allow paralyzed patients to walk again, but also let them feel the ground beneath them.
The researchers inserted electrodes into two regions of the brain: the motor cortex, which is involved in the planning, control, and execution of voluntary movement, and the somatosensory cortex, which processes sensory input from the body, including touch. With these in place, they trained two monkeys to use their electrical brain activity to direct the virtual hands of an avatar to the surface of visually identical virtual objects and, upon contact, differentiate them based on their textures.
"This is the first demonstration of a brain-machine-brain interface (BMBI) that establishes a direct, bidirectional link between a brain and a virtual body," said study leader Miguel Nicolelis, MD, PhD, professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering. "In this BMBI, the virtual body is controlled directly by the animal's brain activity, while its virtual hand generates tactile feedback information that is signaled via direct electrical microstimulation of another region of the animal's cortex."
The researchers say that during the tests, the combined electrical activity of populations of 50 to 200 neurons in the monkeys' motor cortex controlled the steering of the virtual arm. Simultaneously, thousands of neurons in the primary tactile cortex received continuous electrical feedback from the virtual hand's palm, which allowed the monkeys to discriminate between objects based purely on their texture.
The texture of the virtual objects was expressed as a pattern of minute electrical signals transmitted to the monkeys' brains, with a distinct electrical pattern corresponding to each of the three object textures. One monkey took nine tries to learn how to select the correct object during each trial, while the other took just four. The researchers conducted several tests to confirm that the monkeys were actually sensing the objects and not just selecting them at random.
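The study itself does not publish code or its stimulation parameters, but the encoding scheme described above can be sketched in a toy simulation. Everything below is an illustrative assumption, not the researchers' actual setup: each texture is represented by a hypothetical pulse frequency, and a simple discriminator recovers the texture from the pulse pattern "felt" on contact.

```python
import random

# Illustrative stand-in for the study's microstimulation patterns:
# each virtual texture maps to a distinct pulse frequency (Hz).
# These numbers are invented for the sketch.
TEXTURE_PATTERNS = {
    "rewarded": 100,    # hypothetical pattern for the rewarded texture
    "distractor": 50,   # hypothetical pattern for a distractor texture
    "null": 0,          # no stimulation (unrewarded object)
}

def deliver_feedback(texture, duration_s=0.5):
    """Return the number of pulses 'felt' while touching an object."""
    return int(TEXTURE_PATTERNS[texture] * duration_s)

def discriminate(pulse_count, duration_s=0.5):
    """Pick the texture whose pattern best matches the felt pulse count."""
    return min(TEXTURE_PATTERNS,
               key=lambda t: abs(TEXTURE_PATTERNS[t] * duration_s - pulse_count))

# Simulated trial: the objects look identical, so only the feedback
# pattern distinguishes them.
objects = list(TEXTURE_PATTERNS)
random.shuffle(objects)
for obj in objects:
    felt = deliver_feedback(obj)
    assert discriminate(felt) == obj  # texture is recoverable from the pattern
```

The point of the sketch is only that visually identical objects become distinguishable once each carries a unique stimulation pattern; the real task additionally required the animals' brains to learn the mapping, which took the monkeys four and nine sessions respectively.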
"Such an interaction between the brain and a virtual avatar was totally independent of the animal's real body, because the animals did not move their real arms and hands, nor did they use their real skin to touch the objects and identify their texture," added Nicolelis. "It's almost like creating a new sensory channel through which the brain can resume processing information that cannot reach it anymore through the real body and peripheral nerves."
Nicolelis says the remarkable success seen with non-human primates suggests that humans could accomplish the same task much more easily, which means it should be possible to create a robotic exoskeleton that could allow severely paralyzed patients to walk while receiving tactile feedback. Such an exoskeleton would be directly controlled by the wearer's brain activity, allowing them to move autonomously. Meanwhile, sensors distributed across the exoskeleton would provide tactile feedback to allow the patient's brain to identify the texture, shape and temperature of objects.
The researchers recently set themselves the goal of carrying out the first public demonstration of such an autonomous exoskeleton during the opening game of the 2014 FIFA World Cup in Brazil.
Here's some video showing the virtual arm controlled by a monkey selecting objects based on their virtual texture.