
Watch Paralyzed Man Move Robotic Arm with his Mind

22 May 2015

“I joke around with the guys that I want to be able to drink my own beer—to be able to take a drink at my own pace, when I want to take a sip out of my beer and to not have to ask somebody to give it to me. I really miss that independence,” says Erik G. Sorto. (Credit: Caltech)

A man who is paralyzed from the neck down can now move a robotic arm just by thinking about it.

Neural prosthetic devices implanted in the brain’s movement center, the motor cortex, have allowed patients with amputations or paralysis to control the movement of a robotic limb that is either connected to or separate from the patient’s own limb.

But current neuroprosthetics produce motion that is delayed and jerky—not the smooth and seemingly automatic gestures associated with natural movement.

Now, by implanting neuroprosthetics in a part of the brain that controls not the movement directly but rather our intent to move, researchers have developed a way to produce more natural and fluid motions. Their findings are reported in the journal Science.

Rock, paper, scissors

In a clinical trial, researchers successfully implanted such a device in a patient with quadriplegia, giving him the ability to perform a fluid hand-shaking gesture and even play “rock, paper, scissors” using a separate robotic arm.

“When you move your arm, you really don’t think about which muscles to activate and the details of the movement—such as lift the arm, extend the arm, grasp the cup, close the hand around the cup, and so on,” says Richard Andersen, professor of neuroscience at the California Institute of Technology.

“Instead, you think about the goal of the movement. For example, ‘I want to pick up that cup of water.’ So in this trial, we were successfully able to decode these actual intents, by asking the subject to simply imagine the movement as a whole, rather than breaking it down into myriad components.”

For example, the process of seeing a person and then shaking his hand begins with a visual signal (recognizing someone you know) that is first processed in the lower visual areas of the cerebral cortex. The signal then moves up to a high-level cognitive area known as the posterior parietal cortex (PPC). Here, the initial intent to make a movement is formed. These intentions are then transmitted to the motor cortex, through the spinal cord, and on to the arms and legs, where the movement is executed.

Simpler intent

High spinal cord injuries can cause quadriplegia in some patients because movement signals cannot get from the brain to the arms and legs. As a solution, earlier neuroprosthetic implants used tiny electrodes to detect and record movement signals at their last stop before reaching the spinal cord: the motor cortex.

The recorded signal is then carried via wire bundles from the patient’s brain to a computer, where it is translated into an instruction for a robotic limb. However, because the motor cortex normally controls many muscles, the signals tend to be detailed and specific.
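
As a rough, hypothetical illustration of that translation step (not the trial's actual algorithm), a simple linear read-out from binned firing rates to a velocity command for the limb might look like the sketch below; the channel count, bin width, baseline rates, and weights are all assumptions.

```python
import numpy as np

# Toy illustration only: a linear decoder that maps binned spike counts
# from the recording electrodes to a 3-D velocity command for a robotic
# limb. The weights below are random placeholders; a real decoder would
# be calibrated against data recorded while the subject imagines moving.

N_CHANNELS = 96      # assumed: one recording channel per electrode
BIN_SECONDS = 0.05   # assumed: 50 ms spike-count bins

rng = np.random.default_rng(0)
weights = rng.normal(scale=0.01, size=(3, N_CHANNELS))  # placeholder calibration
baseline = np.full(N_CHANNELS, 5.0)                      # assumed baseline rate, in Hz

def decode_velocity(spike_counts):
    """Turn one bin of spike counts into an (x, y, z) velocity command."""
    rates = spike_counts / BIN_SECONDS       # counts -> firing rates (Hz)
    return weights @ (rates - baseline)      # linear read-out of intended motion

# Example: one 50 ms bin of simulated spike counts.
counts = rng.poisson(lam=0.25, size=N_CHANNELS)
print(decode_velocity(counts))               # a small (x, y, z) velocity vector
```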

Researchers wanted to see if the simpler intent to shake the hand could be used to control the prosthetic limb, instead of asking the subject to concentrate on each component of the handshake—a more painstaking and less natural approach.

Andersen and colleagues wanted to improve the versatility of movement that a neuroprosthetic can offer by recording signals from a different brain region—the PPC.

“The PPC is earlier in the pathway, so signals there are more related to movement planning—what you actually intend to do—rather than the details of the movement execution,” he says. “We hoped that the signals from the PPC would be easier for the patients to use, ultimately making the movement process more intuitive. Our future studies will investigate ways to combine the detailed motor cortex signals with more cognitive PPC signals to take advantage of each area’s specializations.”

Intuitive motion

In a clinical trial designed to test the safety and effectiveness of the new approach, the Caltech team collaborated with surgeons at Keck Medicine at the University of Southern California and the rehabilitation team at Rancho Los Amigos National Rehabilitation Center.

The surgeons implanted a pair of small electrode arrays in two parts of the PPC of the quadriplegic patient. Each array contains 96 active electrodes, each of which records the activity of a single neuron in the PPC. The arrays were connected by a cable to a system of computers that processed the signals, decoded the intent of the subject, and controlled output devices that included a computer cursor and a robotic arm developed by collaborators at Johns Hopkins University.
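
The decoding software itself is not described here, but one hedged way to picture decoding an intended goal, rather than a moment-by-moment trajectory, from the two 96-electrode arrays is the sketch below; the goal labels, window length, and nearest-centroid classifier are assumptions for illustration, not the trial's method.

```python
import numpy as np

# Hypothetical sketch: classifying the *goal* of a movement (e.g. "grasp")
# from spike counts pooled over a short planning window, rather than
# decoding a trajectory sample by sample.

N_ARRAYS, N_ELECTRODES = 2, 96   # two PPC arrays, 96 electrodes each
GOALS = ["rest", "handshake", "grasp_cup", "rock", "paper", "scissors"]  # assumed labels

rng = np.random.default_rng(1)
# Placeholder "templates": one mean activity pattern per goal. In practice
# these would be learned from calibration trials in which the subject
# imagines each movement while the arrays record.
templates = {goal: rng.normal(size=N_ARRAYS * N_ELECTRODES) for goal in GOALS}

def decode_goal(window_counts):
    """Nearest-centroid classification of the intended goal."""
    features = window_counts.reshape(-1).astype(float)
    features = (features - features.mean()) / (features.std() + 1e-9)
    return min(GOALS, key=lambda g: np.linalg.norm(features - templates[g]))

# Example: spike counts from one planning window across both arrays.
window = rng.poisson(lam=3.0, size=(N_ARRAYS, N_ELECTRODES))
print(decode_goal(window))   # prints one of the assumed goal labels
```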

After recovering from the surgery, the patient was trained to control the computer cursor and the robotic arm with his mind. Once training was complete, the researchers saw just what they were hoping for: intuitive movement of the robotic arm.

“For me, the most exciting moment of the trial was when the participant first moved the robotic limb with his thoughts. He had been paralyzed for over 10 years, and this was the first time since his injury that he could move a limb and reach out to someone. It was a thrilling moment for all of us,” Andersen says.

“It was a big surprise that the patient was able to control the limb on day one—the very first day he tried,” he adds. “This attests to how intuitive the control is when using PPC activity.”

High fives

The patient, Erik G. Sorto, was also thrilled with the quick results: “I was surprised at how easy it was,” he says. “I remember just having this out-of-body experience, and I wanted to just run around and high-five everybody.”

Over time, Sorto continued to refine his control of his robotic arm, thus providing the researchers with more information about how the PPC works. For example, “we learned that if he thought, ‘I should move my hand over toward the object in a certain way’—trying to control the limb—that didn’t work,” Andersen says. “The thought actually needed to be more cognitive. But if he just thought, ‘I want to grasp the object,’ it was much easier. And that is exactly what we would expect from this area of the brain.”

This better understanding of the PPC will help researchers improve neuroprosthetic devices of the future, Andersen says. “What we have here is a unique window into the workings of a complex high-level brain area as we work collaboratively with our subject to perfect his skill in controlling external devices.”

“In taking care of patients with neurological injuries and diseases—and knowing the significant limitations of current treatment strategies—it is clear that completely new approaches are necessary to restore function to paralyzed patients,” says Charles Y. Liu, professor of neurological surgery, neurology, and biomedical engineering at USC. “Direct brain control of robots and computers has the potential to dramatically change the lives of many people.”

Quality of life

Advancements in prosthetics like these hold promise for the future of patient rehabilitation, says Mindy Aisen, the chief medical officer at Rancho Los Amigos who led the study’s rehabilitation team.

Although tasks like shaking hands and playing “rock, paper, scissors” are important to demonstrate the capability of these devices, the hope is that neuroprosthetics will eventually enable patients to perform more practical tasks that will allow them to regain some of their independence.

“This study has been very meaningful to me. As much as the project needed me, I needed the project,” Sorto says. “The project has made a huge difference in my life. It gives me great pleasure to be part of the solution for improving paralyzed patients’ lives.

“I joke around with the guys that I want to be able to drink my own beer—to be able to take a drink at my own pace, when I want to take a sip out of my beer and to not have to ask somebody to give it to me. I really miss that independence. I think that if it was safe enough, I would really enjoy grooming myself—shaving, brushing my own teeth. That would be fantastic.”

To that end, the researchers are already working on a strategy that could enable patients to perform these finer motor skills. The key is to be able to provide particular types of sensory feedback from the robotic arm to the brain.

Although Sorto’s implant allowed him to control larger movements with visual feedback, “to really do fine dexterous control, you also need feedback from touch,” Andersen says. “Without it, it’s like going to the dentist and having your mouth numbed. It’s very hard to speak without somatosensory feedback.” The newest devices under development feature a mechanism to relay signals from the robotic arm back into the part of the brain that gives the perception of touch.
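
Conceptually, the feedback loop Andersen describes (touch sensed on the robotic hand, relayed back to the brain's touch areas) could be sketched as follows; everything in it, from the sensor layout to the mapping from pressure to stimulation level, is a hypothetical illustration rather than a description of the devices under development.

```python
# Purely conceptual sketch of the touch-feedback loop described above:
# pressure readings from fingertip sensors on a robotic hand are mapped to
# stimulation intensities for electrodes in the brain's touch areas. The
# sensor names, ranges, and linear mapping are illustrative assumptions.

FINGERTIPS = ["thumb", "index", "middle", "ring", "little"]
MAX_PRESSURE_N = 10.0   # assumed sensor range, in newtons
MAX_STIM_LEVEL = 100    # assumed stimulator intensity scale (arbitrary units)

def pressure_to_stim(pressures):
    """Map each fingertip's pressure reading to a clipped stimulation level."""
    levels = {}
    for finger in FINGERTIPS:
        p = min(max(pressures.get(finger, 0.0), 0.0), MAX_PRESSURE_N)
        levels[finger] = round(MAX_STIM_LEVEL * p / MAX_PRESSURE_N)
    return levels

# Example: the hand grips a cup mostly with the thumb and index finger.
print(pressure_to_stim({"thumb": 4.2, "index": 3.8, "middle": 0.5}))
```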

“The reason we are developing these devices is that normally a quadriplegic patient couldn’t, say, pick up a glass of water to sip it, or feed themselves. They can’t even do anything if their nose itches. Seemingly trivial things like this are very frustrating for the patients,” Andersen says. “This trial is an important step toward improving their quality of life.”

The implanted device and signal processors used in the Caltech-led clinical trial were the NeuroPort Array and NeuroPort Bio-potential Signal Processors developed by Blackrock Microsystems in Salt Lake City, Utah. The robotic arm used in the trial was the Modular Prosthetic Limb, developed at the Applied Physics Laboratory at Johns Hopkins. Sorto was recruited to the trial by collaborators at Rancho Los Amigos National Rehabilitation Center and at Keck Medicine of USC.

The National Institutes of Health, the Boswell Foundation, the Department of Defense, and the USC Neurorestoration Center funded the work.

Source: Caltech

Original Study DOI: 10.1126/science.aaa5417

