Man feeds himself with robotic arms

Robotic arms connected directly to the brain of a partially paralysed man allow him to feed himself.

Recent advances in neural science, robotics, and software have enabled scientists to develop a robotic system that responds to muscle movement signals from a partially paralysed person relayed through a brain-machine interface.

Two robotic arms – a fork in one hand, a knife in the other – flank a man seated at a table with a piece of cake on a plate in front of him. A computerised voice announces each action: “moving fork to food” and “retracting knife.” Partially paralysed, the man makes subtle motions with his right and left fists at certain prompts, such as “select cut location”, so that the machine slices off a bite-sized piece. Then comes “moving food to mouth” and another subtle gesture to align the fork with his mouth.

In less than 90 seconds, a person with very limited upper-body mobility who hasn’t been able to use his fingers in about 30 years fed himself dessert using his mind and some smart robotic hands.

A team led by researchers at the Johns Hopkins Applied Physics Laboratory (APL), in Laurel, Maryland, and the Department of Physical Medicine and Rehabilitation (PMR) in the Johns Hopkins School of Medicine, published a paper in the journal Frontiers in Neurorobotics that described this latest feat using a brain-machine interface (BMI) and a pair of modular prosthetic limbs.

Sometimes also referred to as brain-computer interfaces, BMI systems provide a direct communication link between the brain and a computer, which decodes neural signals and ‘translates’ them to perform various external functions, from moving a cursor on a screen to, now, enjoying a bite of cake. In this particular experiment, muscle movement signals from the brain helped control the robotic prosthetics.
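The paper does not publish its decoding pipeline, but the general idea of a BMI decoder can be sketched simply: neural activity recorded over a short window is reduced to a feature vector, and a trained classifier maps that vector to one of a small set of discrete commands the robotic system can act on. The command names, feature dimensions, and linear model below are illustrative assumptions for a minimal sketch, not the team’s actual method.

```python
import numpy as np

# Hypothetical command set: discrete gestures a prosthetic controller could act on.
COMMANDS = ["rest", "left_fist", "right_fist"]

class LinearDecoder:
    """Toy linear decoder: maps a window of neural features to a command.

    In practice the weights would be fit on calibration data recorded while
    the user attempts each gesture; here they are random placeholders.
    """

    def __init__(self, n_features: int, n_commands: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.weights = rng.normal(size=(n_commands, n_features))
        self.bias = np.zeros(n_commands)

    def decode(self, features: np.ndarray) -> str:
        # Compute a linear score for each command; the highest score wins.
        scores = self.weights @ features + self.bias
        return COMMANDS[int(np.argmax(scores))]

# Example: a 96-channel feature vector (e.g., per-channel firing rates)
# decoded into one of the three commands above.
decoder = LinearDecoder(n_features=96, n_commands=len(COMMANDS))
window = np.random.default_rng(1).normal(size=96)
print(decoder.decode(window))
```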

The study built on more than 15 years of research in neural science, robotics, and software, led by APL in collaboration with the Department of PMR, as part of the Revolutionizing Prosthetics program, which was originally sponsored by the US Defense Advanced Research Projects Agency (DARPA). The new paper outlines an innovative model for shared control that enables a human to manoeuvre a pair of robotic prostheses with minimal mental input.

“This shared control approach is intended to leverage the intrinsic capabilities of the brain-machine interface and the robotic system, creating a ‘best of both worlds’ environment where the user can personalise the behaviour of a smart prosthesis,” said Dr Francesco Tenore, a senior project manager in APL’s Research and Exploratory Development Department. Tenore, the paper’s senior author, focuses on neural interface and applied neuroscience research.
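The shared-control idea described here can be illustrated with a simple blending rule: the robot proposes a motion from its own planner, the decoded user signal proposes a correction, and the command actually sent to the arm is a weighted mix of the two. The blending weight, vector sizes, and function names below are assumptions chosen for illustration, not the control law used in the study.

```python
import numpy as np

def shared_control(robot_velocity: np.ndarray,
                   user_velocity: np.ndarray,
                   user_authority: float = 0.3) -> np.ndarray:
    """Blend an autonomous velocity command with a user-decoded one.

    user_authority in [0, 1] sets how much the user's input overrides the
    robot's plan: 0 is fully autonomous, 1 is fully user-driven.
    """
    user_authority = float(np.clip(user_authority, 0.0, 1.0))
    return (1.0 - user_authority) * robot_velocity + user_authority * user_velocity

# Example: the planner wants to move the fork toward the plate along x,
# while the user nudges it slightly upward to line up with the bite.
robot_cmd = np.array([0.10, 0.00, 0.00])    # m/s, planner's proposal
user_cmd = np.array([0.00, 0.00, 0.05])     # m/s, decoded from the user's gesture
print(shared_control(robot_cmd, user_cmd))  # -> [0.07  0.    0.015]
```

Tuning the single authority parameter is one way a user could “personalise the behaviour of a smart prosthesis”: more autonomy for routine motions, more direct control for fine alignment.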

Although our results are preliminary, we are excited about giving users with limited capability a true sense of control over increasingly intelligent assistive machines

Dr Francesco Tenore, a senior project manager in APL’s Research and Exploratory Development Department