Robot arm controlled by quadriplegic’s intentions

California researchers have linked a robot arm to the brain of a quadriplegic man, giving him smooth control over the arm.


Quadriplegic Erik Sorto gives himself a drink, via electrodes in the part of his brain linked to intentions.

by Steve Bush, Electronics Weekly, May 22, 2015

The electrodes are not in the motor cortex or attached to muscle nerves, but are in a part of the brain associated with planning muscle activity: the posterior parietal cortex, or PPC.

“When you move your arm, you really don’t think about which muscles to activate and the details of the movement – such as lift the arm, extend the arm, grasp the cup, close the hand around the cup, and so on. Instead, you think about the goal of the movement – ‘I want to pick up that cup of water,’ for example,” said Caltech professor Richard Andersen. “So in this trial, we were able to decode these actual intents, by asking the subject to simply imagine the movement as a whole, rather than breaking it down into myriad components.”

Caltech’s team collaborated on the research with scientists from the University of Southern California.

According to Caltech, the process of recognising a person and then shaking hands begins with a visual signal that is first processed in the lower visual areas of the cerebral cortex. The signal then moves up to the PPC, a high-level cognitive area where the initial intent to make a movement is formed. These intentions are transmitted to the motor cortex, on through the spinal cord, and then to the muscles where movement is executed.

Implanting electrodes in the PPC, with appropriate computer processing, has given the patient “the ability to perform a fluid hand-shaking gesture and even play ‘rock, paper, scissors’ using a robotic arm,” said Caltech.

The patient, Erik Sorto, was shot in the neck more than a decade ago. Two arrays, each of 96 micro-electrodes, were implanted into his PPC in 2013. Functional magnetic resonance imaging allowed his brain activity to be monitored while Sorto imagined various types of limb and eye movements.

Based on the recorded neural activity it became possible for researchers to predict which limbs he wanted to move, where he wanted to move them, when, and how fast.
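The article does not describe the decoding algorithms used in the study, but as a rough, hypothetical illustration of the idea, the sketch below fits a ridge-regression mapping from binned spike counts to an intended 3-D goal position. The array sizes, synthetic data and regularisation value are assumptions for illustration only, not the researchers’ actual method.

```python
# Illustrative sketch only: predict an intended movement goal from binned
# spike counts with ridge regression. All shapes and data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_units = 200, 192                 # e.g. 2 arrays x 96 electrodes
goals = rng.uniform(-1, 1, (n_trials, 3))    # hypothetical 3-D target positions

# Fake "tuning": each unit's firing rate depends linearly on the intended goal.
tuning = rng.normal(size=(3, n_units))
rates = goals @ tuning + rng.normal(scale=0.5, size=(n_trials, n_units))

# Ridge regression decoder: W maps firing rates back to the intended goal.
lam = 1.0
W = np.linalg.solve(rates.T @ rates + lam * np.eye(n_units), rates.T @ goals)

decoded = rates @ W
print("mean decode error:", np.mean(np.linalg.norm(decoded - goals, axis=1)))
```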

This information was then used to steer a computer cursor or to direct a robotic arm – the latter developed by Johns Hopkins University.
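Again, the control software itself is not described in the article. Conceptually, though, once a goal has been decoded, a cursor or robotic end-effector can be stepped toward it a little on each control cycle. The sketch below is a hypothetical proportional-step controller with made-up gains and dimensions, intended only to illustrate that idea.

```python
# Illustrative sketch (not the study's control software): step an effector
# toward a decoded goal position on each control cycle.
import numpy as np

def step_toward_goal(position, decoded_goal, gain=0.2, max_step=0.05):
    """Move a fraction of the way toward the decoded goal,
    capped at max_step per control cycle."""
    error = decoded_goal - position
    step = gain * error
    norm = np.linalg.norm(step)
    if norm > max_step:
        step = step * (max_step / norm)
    return position + step

pos = np.zeros(3)
goal = np.array([0.3, -0.1, 0.25])   # hypothetical decoded goal position
for _ in range(50):                  # e.g. 50 control cycles
    pos = step_toward_goal(pos, goal)
print("final position:", np.round(pos, 3))
```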

Beyond this, it was found that Sorto could alter the activity of neuron populations simply by imagining different motor actions.

Trying to control the limb directly with thoughts like ‘I should move my hand toward the object in a certain way’ didn’t work, while ‘I want to grasp the object’ was more likely to succeed.

Visual feedback allows Sorto to control large movements, but touch feedback is needed for fine control. “Without touch, it’s like going to the dentist and having your mouth numbed. It’s very hard to speak without somatosensory feedback,” said Andersen, whose team is working on a mechanism to relay signals from the robotic arm back into the part of the brain that gives the perception of touch.

Future studies will also investigate ways to combine the more detailed and specialised signals available in the motor cortex with the PPC’s more cognitive signals, to take advantage of each area’s specialisations.

The work is covered in a study published by Caltech lab members Tyson Aflalo, Spencer Kellis, Christian Klaes, Brian Lee, Ying Shi, Kelsie Pejsa and Richard Andersen in the May 22 edition of the journal Science.

Source: Electronics Weekly

Neurophysiology. Decoding motor imagery from the posterior parietal cortex of a tetraplegic human, Aflalo T, Kellis S, Klaes C, Lee B, Shi Y, Pejsa K, Shanfield K, Hayes-Jackson S, Aisen M, Heck C, Liu C, Andersen RA. Science. 2015 May 22;348(6237):906-10. doi: 10.1126/science.aaa5417. Full text

Further reading
Cognitive signals for brain-machine interfaces in posterior parietal cortex include continuous 3D trajectory commands, Hauschild M, Mulliken GH, Fineman I, Loeb GE, Andersen RA. Proc Natl Acad Sci USA. 2012 Oct 16;109(42):17075-80. doi: 10.1073/pnas.1215092109. Epub 2012. Full text

Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task, Revechkis B, Aflalo TN, Kellis S, Pouratian N, Andersen RA. J Neural Eng. 2014 Dec;11(6):066014. doi: 10.1088/1741-2560/11/6/066014. Epub 2014 Nov 14. Erratum in: J Neural Eng. 2015 Feb;12(1):019601. Full text

Hand Shape Representations in the Human Posterior Parietal Cortex, Klaes C, Kellis S, Aflalo T, Lee B, Pejsa K, Shanfield K, Hayes-Jackson S, Aisen M, Heck C, Liu C, Andersen RA. J Neurosci. 2015 Nov 18;35(46):15466-76. doi: 10.1523/JNEUROSCI.2747-15.2015. Full text

Prediction of imagined single-joint movements in a person with high-level tetraplegia, Ajiboye AB, Simeral JD, Donoghue JP, Hochberg LR, Kirsch RF. IEEE Trans Biomed Eng. 2012 Oct;59(10):2755-65. Epub 2012 Jul 23. Full text

Cognitive control signals for neural prosthetics, Musallam S, Corneil BD, Greger B, Scherberger H, Andersen RA. Science. 2004 Jul 9;305(5681):258-62.

Paraplegic Man Drinks a Beer Using a Mind-Controlled Prosthetic Arm. Erik G. Sorto has been paralyzed from the neck down for about 12 years. He is now able to command a robot arm using brain implants that read his intentions. Brain Decoder. Credit: Caltech team. Youtube May 21, 2015