People with paralysis control robotic arms using brain-computer interface
On April 12, 2011, nearly 15 years after she became paralyzed and unable to speak, a woman controlled a robotic arm by thinking about moving her own arm and hand, using it to lift a bottle of coffee to her mouth and take a drink. That achievement is one of the advances in brain-computer interfaces, restorative neurotechnology, and assistive robot technology described in the May 17 edition of the journal Nature by the BrainGate2 collaboration of researchers at the Department of Veterans Affairs, Brown University, Massachusetts General Hospital, Harvard Medical School, and the German Aerospace Center (DLR).
A 58-year-old woman (“S3”) and a 66-year-old man (“T2”) participated in the study. Each had been paralyzed by a brainstem stroke years earlier that left them with no functional control of their limbs. In the research, the participants used neural activity to directly control two different robotic arms, one developed by the DLR Institute of Robotics and Mechatronics and the other by DEKA Research and Development Corp., to perform reaching and grasping tasks across a broad three-dimensional space.

The BrainGate2 pilot clinical trial employs the investigational BrainGate system, initially developed at Brown University, in which a baby-aspirin-sized device with a grid of 96 tiny electrodes is implanted in the motor cortex, a part of the brain involved in voluntary movement. The electrodes are close enough to individual neurons to record the neural activity associated with intended movement. An external computer translates the pattern of impulses across a population of neurons into commands to operate assistive devices, such as the DLR and DEKA robot arms used in the study now reported in Nature.
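The Nature paper describes this decoding step in words rather than code, so the following is only a rough, hypothetical sketch of the kind of translation involved: binned firing rates from a 96-channel array are mapped to a continuous velocity command through a linear model fit during a calibration phase. The synthetic cosine-tuned data, the ridge-regression fit, and the function names are illustrative assumptions and are not taken from the BrainGate system.

```python
# Illustrative sketch only: a minimal linear neural decoder of the general kind
# used in intracortical brain-computer interfaces, mapping binned firing rates
# from a 96-channel array to a 2-D velocity command. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_bins = 96, 2000

# Synthetic "intended velocity" (vx, vy) and cosine-tuned firing rates.
intended_vel = rng.normal(size=(n_bins, 2))
tuning = rng.normal(size=(2, n_channels))        # preferred-direction weights
baseline = rng.uniform(5, 20, size=n_channels)   # baseline firing rates (Hz)
rates = baseline + intended_vel @ tuning + rng.normal(scale=2.0, size=(n_bins, n_channels))

# Calibration: fit a ridge-regularized linear map from firing rates to velocity.
X = np.hstack([rates, np.ones((n_bins, 1))])     # add a bias column
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ intended_vel)

def decode(spike_bin: np.ndarray) -> np.ndarray:
    """Translate one bin of 96 firing rates into a 2-D velocity command."""
    return np.append(spike_bin, 1.0) @ W

# "Online" use: each new bin of recorded activity becomes a movement command.
print(decode(rates[0]), "vs intended", intended_vel[0])
```

Published intracortical BCI work often uses more elaborate decoders, such as Kalman filters recalibrated for each session; the sketch is meant only to show the overall flow from recorded neural activity to a movement command.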
The study represents the first demonstration, and the first peer-reviewed report, of people with tetraplegia using brain signals to control a robotic arm in three-dimensional space to complete a task usually performed by their own arm. Specifically, S3 and T2 controlled the arms to reach for and grasp foam targets placed in front of them on flexible supports. In addition, S3 used the DLR robot to pick up a bottle of coffee, bring it to her mouth, command it to tilt so she could drink through a straw, and return the bottle to the table. Her BrainGate-enabled control of the robotic arm during the drinking task combined two-dimensional movements across the table top with a “grasp” command that either grasped and lifted the bottle or tilted the robotic hand.
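As a further illustration of the control scheme just described, here is a small hypothetical sketch of how continuous two-dimensional translation might be combined with a discrete grasp signal that is interpreted by context, either closing and lifting the hand or tilting it. The state names, time step, and decision logic are invented for the example and are not taken from the study's software.

```python
# Hedged sketch (not the study's software): combining a continuous 2-D decoded
# velocity with a discrete "grasp" state, where the same grasp signal either
# grasps and lifts the bottle or tilts the hand depending on context.
from dataclasses import dataclass

@dataclass
class ArmState:
    x: float = 0.0
    y: float = 0.0
    holding_bottle: bool = False

def step(state: ArmState, vx: float, vy: float, grasp: bool, near_mouth: bool) -> str:
    """Apply one control update: translate in the plane, then interpret grasp."""
    dt = 0.1
    state.x += vx * dt
    state.y += vy * dt
    if not grasp:
        return "translate"
    if not state.holding_bottle:
        state.holding_bottle = True   # first grasp: close the hand and lift
        return "grasp-and-lift"
    if near_mouth:
        return "tilt"                 # near the mouth: tilt so the user can drink
    return "hold"

arm = ArmState()
print(step(arm, 0.2, 0.0, grasp=False, near_mouth=False))
print(step(arm, 0.0, 0.0, grasp=True, near_mouth=False))
print(step(arm, 0.0, 0.0, grasp=True, near_mouth=True))
```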
“Our goal in this research is to develop technology that will restore independence and mobility for people with paralysis or limb loss,” said lead author Dr. Leigh Hochberg, a neuroengineer and critical care neurologist who holds appointments at the Department of Veterans Affairs, Brown University, Massachusetts General Hospital, and Harvard. He is the sponsor-investigator for the BrainGate2 pilot clinical trial. “We have much more work to do, but the encouraging progress of this research is demonstrated not only in the reach-and-grasp data, but even more so in S3’s smile when she served herself coffee of her own volition for the first time in almost 15 years.”