Washington Post | New Robotic Hand Named After Luke Skywalker Helps Amputee Touch and Feel Again

15 November 2017

A volunteer in the experiment clasps his hands together and can feel one hand with the other for the first time since his left hand was amputated. (University of Utah)

Keven Walgamott wasn’t sure what to expect when scientists first hooked up what was left of his arm to a computer.

Last year — 14 years after he lost his hand and part of his arm in an electrical accident — he heard about a team at the University of Utah working on an experimental robotic arm. The prosthetic hand and fingers would be controlled by an amputee’s own nerves. Even more challenging, researchers were trying to restore the sense of touch to amputees through that robotic hand.

Walgamott volunteered for the experimental program. A few weeks after surgeons implanted electrodes into the nerves of his arm last year, he found himself hooked up to a computer getting ready to touch something with his left hand for the first time in more than a decade.

The Utah researchers had created a computer program to simulate the feel of touching a virtual wall — an early test to prepare Walgamott for the robotic arm.

As Walgamott moved his arm, a virtual hand on the computer screen before him moved as well, tracing the ridges of the corrugated wall.

“It was stunning. I could actually feel the wall. I could feel the bumps along it,” he said. “It almost brought tears to my eyes.”

Researchers at the University of Utah are developing an experimental robotic arm that allows amputees to control it using their own nerves. (University of Utah)

Then researchers attached the robotic arm itself, putting Walgamott through a battery of tests over 14 months that had him touch and manipulate objects with it.

“When I went to grab something, I could feel myself grabbing it. When I thought about moving this or that finger, it would move almost right away,” he said. “I don’t know how to describe it except that it was like I had a hand again.”

Using a robotic arm that allowed him to feel objects again, Keven Walgamott was able to pick a grape without crushing it. (University of Utah)

At the Society for Neuroscience conference in Washington on Tuesday, the University of Utah team presented part of their work on adding the sense of touch and movement to prostheses — the latest step in the rapidly developing field of neuroprosthetics.

Over the course of the past year, while working with Walgamott as their key subject, they have found adding touch to prostheses markedly improves motor skills of amputees compared with robotic prostheses on the market. Adding the sense of touch to prosthetic hands also appears to reduce a painful feeling many amputees experience called phantom pain, and it creates a sense of ownership over the device, researchers said.

“By adding sensory feedback, it becomes a closed-loop system that mimics biology,” said Jacob George, a bioengineering PhD student at the University of Utah and lead author of Tuesday’s study. The goal, he explained, is to get prosthetic technology to a point where someone using a prosthesis wouldn’t have to think through every movement to pick up a cup. They wouldn’t even have to look at the cup. They would simply move the hand toward it using their brain and existing nervous system, feel it and pick it up.

The most cutting-edge prosthetic hands available can make sophisticated movements, but they require complicated — and often imprecise — methods of operation. Some rely on tilt motions by the user’s foot and others on movements by the muscles remaining in a user’s arm.

The Utah research group’s approach, however, relies on a device called the Utah Slanted Electrode Array. The device is implanted directly into the nerves in a subject’s arm. The USEA, along with electrodes implanted in muscles, allows amputees to control a robotic hand as if they were flexing or moving their original hand. The approach also allows signals like sensation to be transmitted back to the subject’s nervous system, creating a “looped system” — like in a human limb — where the hand’s feeling and movements inform each other.

“We often think of touch as one thing, but it’s more than that. It’s pressure, vibration, temperature, pain,” said Gregory Clark, the bioengineering professor leading the Utah research team. Because of that, it has required painstakingly slow work from a multidisciplinary team of experts — over the course of years — to build those sensations into the robotic arm, figure out which spot on the hand corresponds with which nerve fiber in the arm and the algorithms required to send touch signals back into the nervous system.

University of Utah researchers have developed technology that allows users to feel through this robotic arm. In one experiment, they were able to use the hand to distinguish soft foam from hard plastic. (University of Utah)

Clark’s team is part of a larger effort funded by the U.S. military’s Defense Advanced Research Projects Agency. DARPA launched its neuroprosthetics program, called HAPTIX, in 2014 with the goal of developing, within a few years, an advanced robotic arm that would help amputees feel and move intuitively. The researchers also received funding from the National Science Foundation.

The robotic arm the Utah researchers have been working with was developed under the HAPTIX program by the company DEKA (the company founded by Segway inventor Dean Kamen). The state-of-the-art robotic limb was dubbed the “Luke” arm by its makers, after the advanced prosthesis wielded by Luke Skywalker in “Star Wars.”

The “Luke” arm, a robotic prosthesis created by DEKA and named after the prosthetic hand wielded by Luke Skywalker. (University of Utah)

The results of the Utah group’s experimental tests so far have been both gratifying and inspiring, the researchers said.

Walgamott — a real estate agent in Utah — described the joy of being able to do everyday mundane tasks again with his left hand — like picking up an egg without crushing it, clasping his hands together and holding his wife’s hand.

The experimental robotic arm allowed Keven Walgamott to hold and feel his wife’s hand again. (University of Utah)

But the highlight of his entire 14 months in the experimental program, he said, was being able to put a pillow into a pillowcase on his own.

“When you have just one hand, you learn to adapt,” he said, describing the infuriatingly slow process he usually uses for pillowcases, pulling them on inch by inch on each side, rotating the whole time. “To just take a pillow in one hand and put the pillowcase on with the other. I know it sounds simple, but it’s amazing.”
