
Quadriplegic woman gets chocolate fix using thought-controlled robotic arm

Quadriplegic Jan Scheuermann prepares to take a bite out of a chocolate bar she is guiding into her mouth with a thought-controlled robot arm while research assistant Brian Wodlinger, Ph.D., watches on (Photo: UPMC)

The prosthetic arm was designed by the Johns Hopkins University Applied Physics Laboratory (JHU/APL) and funded by the U.S. Defense Advanced Research Projects Agency (DARPA) (Photo: DARPA and JHU/APL)

Earlier this year, a 58-year-old woman who had lost the use of her limbs was able to drink a cup of coffee by herself using a robotic arm controlled by her thoughts via a brain-computer interface (BCI). Now, in a separate study, another woman with longstanding quadriplegia has fed herself a chocolate bar using a mind-controlled, human-like robot arm that offers what the researchers claim is a level of agility and control approaching that of a human limb.

The University of Pittsburgh School of Medicine and the University of Pittsburgh Medical Center (UPMC) developed the system, which was tested by Jan Scheuermann, 52, of Pittsburgh. A mother of two, she was diagnosed 14 years ago with spinocerebellar degeneration, a degenerative brain disorder that left her paralyzed from the neck down.

UPMC neurosurgeon Elizabeth Tyler-Kabara, who is also an assistant professor in the Department of Neurological Surgery at the Pitt School of Medicine, placed two electrode grids, each with 96 tiny contact points, into the regions of Scheuermann's motor cortex that control right arm and hand movements.

“Prior to surgery, we conducted functional imaging tests of the brain to determine exactly where to put the two grids,” Tyler-Kabara said. “Then we used imaging technology in the operating room to guide placement of the grids, which have points that penetrate the brain’s surface by about one-sixteenth of an inch.”

These electrodes picked up signals from individual neurons and fed them to a computer running algorithms that detected real or imagined movements, such as lifting the arm or rotating the wrist. The signals were then translated into instructions for the robotic arm, mimicking the way an unimpaired brain sends signals to move limbs.
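
As a rough illustration of that decoding step, the sketch below maps one time bin of per-channel firing rates to a seven-dimensional velocity command through a simple linear transform. The channel count, weight matrix and Poisson-sampled rates are placeholders invented for this example; the article does not describe the study's actual decoder at this level of detail.

```python
# Illustrative sketch only, not the Pitt/UPMC decoder: a linear mapping from
# binned firing rates to arm velocity commands, one common way BCI pipelines
# of this kind are structured.
import numpy as np

N_CHANNELS = 192   # two grids of 96 contacts each, as described in the article
N_DOF = 7          # translation, wrist orientation and grasp commands

rng = np.random.default_rng(0)
W = rng.normal(size=(N_DOF, N_CHANNELS)) * 0.01  # decoder weights, normally fit during calibration
baseline = np.zeros(N_CHANNELS)                  # per-channel resting firing rate

def decode(firing_rates: np.ndarray) -> np.ndarray:
    """Map one time bin of per-channel firing rates to a 7-DOF velocity command."""
    return W @ (firing_rates - baseline)

# One step of the control loop: read spike counts, decode, send to the arm.
rates_this_bin = rng.poisson(lam=10.0, size=N_CHANNELS).astype(float)
velocity_command = decode(rates_this_bin)  # e.g. hand translation, wrist angles, grasp
```

In a real system, a mapping like this would be calibrated against observed or imagined movements and refined as the user practices.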

Within three months, Scheuermann was able to maneuver the hand, which she called Hector, accurately and precisely. With the ability to flex the wrist back and forth, move it from side to side and rotate it clockwise and counter-clockwise, as well as grip objects, the system offers what the researchers refer to as control in seven dimensions (7D). Using her mind, Scheuermann was able to instruct Hector to pick up blocks, tubes and a ball and place them in a tray.

Jan Scheuermann stacks cones with a mind-controlled robot arm (Photo: UPMC)

After some practice, she was able to perform the movements with 91.6 percent accuracy, and 30 seconds faster than at the start of the trial. The researchers found this clinically significant, opening the way for future developments in similar prostheses and further innovations that could change the lives of people with disabilities.

Using this technique, Scheuermann could feed herself chocolate – a goal she had voiced at the start of the trial. The researchers applauded her as she performed this feat less than a year later. “One small nibble for a woman, one giant bite for BCI,” Scheuermann quipped.

Professor Andrew Schwartz from the Department of Neurobiology at Pitt School of Medicine said the breakthrough was unique.

“This is a spectacular leap toward greater function and independence for people who are unable to move their own arms,” Schwartz said. “In developing mind-controlled prosthetics, one of the biggest challenges has always been how to translate brain signals that indicate limb movement into computer signals that can reliably and accurately control a robotic prosthesis.

"Most mind-controlled prosthetics have achieved this by an algorithm which involves working through a complex 'library' of computer-brain connections," Schwartz added. "However, we've taken a completely different approach here, by using a model-based computer algorithm which closely mimics the way that an unimpaired brain controls limb movement. The result is a prosthetic hand which can be moved far more accurately and naturalistically than previous efforts."

The next big step for the BCI technology could be to stimulate the brain to generate sensation by using a two-way electrode system. That would allow the user to "feel" objects and loosen their grip to pick up delicate ones or tighten it for a firmer grasp.

And after that, according to lead investigator Assistant Professor Jennifer Collinger, anything is possible. “It might even be possible to combine brain control with a device that directly stimulates muscles to restore movement of the individual’s own limb,” she said.

The team's study is published in The Lancet.

Source: University of Pittsburgh School of Medicine
