Adding sense of touch improves control of robotic arm

A portrait of Jennifer Collinger

Most nondisabled people take their ability to perform simple daily tasks for granted. When they reach for a warm mug of coffee, for instance, they can feel its weight and temperature and adjust their grip accordingly so that no liquid is spilled. People with full sensory and motor control of their arms and hands know the instant they touch or grasp an object, allowing them to start moving or lifting it with confidence.

But those tasks become much more difficult when a person operates a prosthetic arm—let alone a mind-controlled one.

In a paper published today in Science, a team of bioengineers from the University of Pittsburgh Rehab Neural Engineering Labs describes how adding brain stimulation that evokes tactile sensations makes it easier for the operator to manipulate a brain-controlled robotic arm. In the experiment, supplementing vision with artificial tactile perception cut the time spent grasping and transferring objects in half, from a median of 20.9 seconds to 10.2 seconds.

“In a sense, this is what we hoped would happen—but perhaps not to the degree that we observed,” said co-senior author Jennifer Collinger, pictured, associate professor in Pitt’s Department of Physical Medicine and Rehabilitation. “Sensory feedback from limbs and hands is hugely important for doing normal things in our daily lives, and when that feedback is lacking, people’s performance is impaired.”

Study participant Nathan Copeland, whose progress was described in the paper, was the first person in the world to be implanted with tiny electrode arrays not just in his brain’s motor cortex but also in his somatosensory cortex, a region of the brain that processes sensory information from the body. The arrays allow him to control the robotic arm with his mind and to receive tactile sensory feedback, similar to how neural circuits operate when a person’s spinal cord is intact.

“I was already extremely familiar with both the sensations generated by stimulation and performing the task without stimulation. Even though the sensation isn’t ‘natural’—it feels like pressure and gentle tingle—that never bothered me,” said Copeland. “There wasn't really any point where I felt like stimulation was something I had to get used to. Doing the task while receiving the stimulation just went together like PB&J.”

After a car crash that left him with limited use of his arms, Copeland enrolled in a clinical trial testing the sensorimotor microelectrode brain-computer interface (BCI) and was implanted with four microelectrode arrays developed by Blackrock Microsystems (also commonly referred to as Utah arrays).

This paper builds on an earlier study that described for the first time how stimulating sensory regions of the brain with tiny electrical pulses can evoke sensations in distinct regions of a person’s hand, even after the person has lost feeling in their limbs because of a spinal cord injury. In this new study, the researchers combined reading information from the brain to control the movement of the robotic arm with writing information back in to provide sensory feedback.

In a series of tests in which the BCI operator was asked to pick up and transfer various objects from a table to a raised platform, providing tactile feedback through electrical stimulation allowed the participant to complete the tasks twice as fast as in tests without stimulation.

In the new paper, the researchers wanted to test the effect of sensory feedback in conditions that would resemble the real world as closely as possible.

“We didn’t want to constrain the task by removing the visual component of perception,” said co-senior author Robert Gaunt, associate professor in the Department of Physical Medicine and Rehabilitation. “When even limited and imperfect sensation is restored, the person’s performance improved in a pretty significant way. We still have a long way to go in terms of making the sensations more realistic and bringing this technology to people’s homes, but the closer we can get to recreating the normal inputs to the brain, the better off we will be.”

Additional authors of this study include Sharlene Flesher, Jeffrey Weiss, Christopher Hughes, Angelica Herrera and Michael Boninger of Pitt; John Downey of the University of Chicago; and Elizabeth Tyler-Kabara of the University of Texas at Austin.

This work was supported by the Defense Advanced Research Projects Agency and Space and Naval Warfare Systems Center Pacific.