Quadriplegic woman gets chocolate fix using thought-controlled robotic arm

By Leon Gettler

December 18, 2012

Quadriplegic Jan Scheuermann prepares to take a bite out of a chocolate bar she is guiding into her mouth with a thought-controlled robot arm while research assistant Brian Wodlinger, Ph.D., watches on (Photo: UPMC)

Earlier this year, a 58-year-old woman who had lost the use of her limbs successfully drank a cup of coffee by herself using a robotic arm controlled by her thoughts via a brain-computer interface (BCI). Now, in a separate study, another woman with longstanding quadriplegia has fed herself a chocolate bar using a mind-controlled, human-like robot arm, offering what researchers claim is a level of agility and control approaching that of a human limb.

The University of Pittsburgh School of Medicine and the University of Pittsburgh Medical Center (UPMC) developed the system, which was tested by Jan Scheuermann, 52, of Pittsburgh. A mother of two, she was diagnosed 14 years ago with spinocerebellar degeneration, a degenerative brain disorder that left her paralyzed from the neck down.

UPMC neurosurgeon Elizabeth Tyler-Kabara, who is also an assistant professor in the Department of Neurological Surgery at the Pitt School of Medicine, placed two electrode grids, each with 96 tiny contact points, into regions of Scheuermann’s motor cortex that control right arm and hand movements.

“Prior to surgery, we conducted functional imaging tests of the brain to determine exactly where to put the two grids,” Tyler-Kabara said. “Then we used imaging technology in the operating room to guide placement of the grids, which have points that penetrate the brain’s surface by about one-sixteenth of an inch.”

These electrodes, picking up signals from individual neurons, were connected to a robotic hand powered by a computer running algorithms that detected real or imagined movements, like lifting an arm or rotating a wrist. The signals were then translated into instructions for the robotic arm, mimicking the way an unimpaired brain sends signals to move limbs.
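At its core, the decoding step described above maps the firing rates of recorded neurons onto movement commands for the arm. A highly simplified sketch of that idea is below; the actual UPMC algorithm is far more sophisticated (and model-based, as discussed later in this article), so all names, dimensions and the linear-mapping form here are illustrative assumptions, not the team's published method.

```python
import numpy as np

# Illustrative sketch only: decode per-neuron firing rates into a
# movement command with a simple linear mapping. The real system uses a
# model-based algorithm; this is not the published UPMC decoder.

N_NEURONS = 96 * 2  # two electrode grids, 96 contact points each
N_DIMS = 7          # "7D" control: 3D translation, 3D wrist rotation, grasp

rng = np.random.default_rng(0)
# In practice these weights would be fitted during calibration sessions,
# while the user imagines movements; here they are random placeholders.
weights = rng.normal(size=(N_DIMS, N_NEURONS))

def decode(firing_rates: np.ndarray) -> np.ndarray:
    """Translate a vector of per-neuron firing rates into a 7D command."""
    return weights @ firing_rates

# One simulated sample of neural activity -> one 7D arm command.
command = decode(rng.normal(size=N_NEURONS))
```

Each call to `decode` yields one command vector; running this in a loop at the recording rate is what lets the arm track the user's intent continuously.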

Within three months, Scheuermann was able to maneuver the hand, which she nicknamed Hector, accurately and precisely. With the ability to flex the wrist back and forth, move it from side to side and rotate it clockwise and counter-clockwise, as well as grip objects, the system offers what the researchers refer to as control in seven dimensions (7D). Using her mind, Scheuermann was able to instruct Hector to pick up blocks, tubes and a ball and place them in a tray.

Jan Scheuermann stacks cones with a mind-controlled robot arm (Photo: UPMC)

After some practice, she was able to perform the movements with 91.6 percent accuracy, and she could complete them 30 seconds faster than at the start of the trial. The researchers found this to be clinically significant, opening the way for further development of similar prostheses and innovations that could change the lives of people with disabilities.

Using this technique, Scheuermann could feed herself chocolate – a goal she had voiced at the start of the trial. The researchers applauded her as she performed this feat less than a year later. “One small nibble for a woman, one giant bite for BCI,” Scheuermann quipped.

Professor Andrew Schwartz from the Department of Neurobiology at Pitt School of Medicine said the breakthrough was unique.

“This is a spectacular leap toward greater function and independence for people who are unable to move their own arms,” Schwartz said. “In developing mind-controlled prosthetics, one of the biggest challenges has always been how to translate brain signals that indicate limb movement into computer signals that can reliably and accurately control a robotic prosthesis.

“Most mind-controlled prosthetics have achieved this by an algorithm which involves working through a complex ‘library’ of computer-brain connections,” Schwartz added. “However, we’ve taken a completely different approach here, by using a model-based computer algorithm which closely mimics the way that an unimpaired brain controls limb movement. The result is a prosthetic hand which can be moved far more accurately and naturalistically than previous efforts.”

The next big step for the BCI technology could be to stimulate the brain to generate sensation by using a two-way electrode system. That would allow the user to "feel" objects and loosen their grip to pick up delicate ones or tighten it for a firmer grasp.

And after that, according to lead investigator Assistant Professor Jennifer Collinger, anything is possible. “It might even be possible to combine brain control with a device that directly stimulates muscles to restore movement of the individual’s own limb,” she said.

The team's study is published in The Lancet.

Source: University of Pittsburgh School of Medicine

About the Author
Leon Gettler is an award-winning author and freelance journalist with a strong background in newspapers, magazines and podcasts. He is passionately drawn to all things innovative and unknown, with a deep interest in telecommunications, environmental technology and design. When not indulging his passion for reading and writing, he can be found memorizing lines immortalized by Gerry Mulligan on baritone sax. He lives in Melbourne, Australia.