- Scientists develop a system that responds to muscle movement signals through a brain-machine interface.
- The theory is that adding sensory feedback, delivered straight to a person’s brain, may help a person perform some tasks without the constant visual feedback required in the current experiment.
- The project demonstrates capabilities that could be developed to help people with disabilities, but researchers say more work needs to be done.
Recent advances in neural science, robotics and software have enabled scientists to develop a robotic system that responds to muscle movement signals from a partially paralysed person, relayed through a brain-machine interface.
Humans and robots act as a team to make performing some tasks a piece of cake.
A team led by researchers at The Johns Hopkins Applied Physics Laboratory (APL), in Laurel, Maryland, and the Department of Physical Medicine and Rehabilitation (PMR) in the Johns Hopkins School of Medicine, published a paper in the journal Frontiers in Neurorobotics that described their latest feat using a brain-machine interface (BMI) and a pair of modular prosthetic limbs.
BMI systems provide a direct communication link between the brain and a computer, which decodes neural signals and “translates” them to perform various external functions, from moving a cursor on a screen to now enjoying a bite of cake. In this particular experiment, muscle movement signals from the brain helped control the robotic prosthetics.
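The decode-and-translate loop described above can be sketched in miniature. The decoder, label set and threshold below are illustrative assumptions, not the study’s actual method: a vector of neural features is mapped to a discrete command that an external device could act on.

```python
# Hypothetical sketch of a BMI "decode and translate" step (illustrative
# only): a neural feature vector is classified into a discrete command.

from dataclasses import dataclass


@dataclass
class DecodedCommand:
    action: str        # e.g. "left_fist", "right_fist", "rest"
    confidence: float  # strength of the winning feature channel


def decode(features: list, threshold: float = 0.5) -> DecodedCommand:
    """Toy decoder: pick the strongest channel and map it to an action.

    Falls back to "rest" when no channel clears the confidence threshold.
    """
    actions = ["rest", "left_fist", "right_fist"]  # assumed label set
    best = max(range(len(features)), key=lambda i: features[i])
    conf = features[best]
    action = actions[best] if conf >= threshold else "rest"
    return DecodedCommand(action, conf)


print(decode([0.1, 0.8, 0.3]).action)  # strongest channel wins: left_fist
print(decode([0.2, 0.3, 0.1]).action)  # below threshold, so: rest
```

A real system would replace the toy classifier with a trained model over recorded neural data, but the overall shape — features in, discrete device command out — is the same.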
Innovative model for shared control
The study was built on more than 15 years of research in neural science, robotics, and software, led by APL in collaboration with the Department of PMR, as part of the Revolutionizing Prosthetics program, which was originally sponsored by the US Defense Advanced Research Projects Agency (DARPA).
The new paper outlines an innovative model for shared control that enables a human to manoeuvre a pair of robotic prostheses with minimal mental input.
In less than 90 seconds, a person with very limited upper body mobility who hasn’t been able to use his fingers in about 30 years just fed himself dessert using his mind and some smart robotic hands.
A computerised voice announces each action: “moving the fork to food” and “retracting knife.” Partially paralysed, the man makes subtle motions with his right and left fists at certain prompts, such as “select cut location”, so that the machine slices off a bite-sized piece.
Now: “moving food to mouth” and another subtle gesture to align the fork with his mouth.
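The feeding sequence above alternates autonomous robot steps with prompts that wait for a user gesture. A minimal sketch of that shared-control pattern, with step names and gesture labels taken loosely from the description (not the paper’s actual implementation):

```python
# Hedged sketch of a shared-control sequence: the robot runs autonomous
# steps and pauses at decision points for a decoded user gesture.
# Step names are illustrative, based on the announced actions above.

AUTONOMOUS = "autonomous"
USER_PROMPT = "user_prompt"

FEEDING_SEQUENCE = [
    (AUTONOMOUS, "moving fork to food"),
    (USER_PROMPT, "select cut location"),    # subtle fist motion steers the cut
    (AUTONOMOUS, "cutting bite-sized piece"),
    (AUTONOMOUS, "retracting knife"),
    (AUTONOMOUS, "moving food to mouth"),
    (USER_PROMPT, "align fork with mouth"),  # gesture fine-tunes the pose
]


def run_sequence(get_gesture):
    """Announce each step; block on a decoded gesture at user prompts."""
    log = []
    for kind, step in FEEDING_SEQUENCE:
        log.append(step)                     # "computerised voice" announcement
        if kind == USER_PROMPT:
            log.append(f"gesture: {get_gesture()}")
    return log


# Simulate a user who always answers with a right-fist gesture.
for entry in run_sequence(lambda: "right_fist"):
    print(entry)
```

The division of labour this models is the point of the paper: the machine handles the long autonomous stretches, while the human supplies only the few gestures that matter.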
A true sense of control
“This shared control approach is intended to leverage the intrinsic capabilities of the brain-machine interface and the robotic system, creating a ‘best of both worlds’ environment where the user can personalise the behaviour of a smart prosthesis,” Dr Francesco Tenore, a senior project manager in APL’s Research and Exploratory Development Department, said.
The paper’s senior author, Tenore focuses on neural interface and applied neuroscience research.
“Although our results are preliminary, we are excited about giving users with limited capability a true sense of control over increasingly intelligent assistive machines,” he added.
Dr David Handelman, the paperโs first author and a senior roboticist in the Intelligent Systems Branch of the Research and Exploratory Development Department at APL, said that one of the most important advances in robotics demonstrated in the paper is combining robot autonomy with limited human input, with the machine doing most of the work while enabling the user to customise robot behaviour to their liking.
“For robots to perform human-like tasks for people with reduced functionality, they will require human-like dexterity. Human-like dexterity requires complex control of a complex robot skeleton,” he explained.
“Our goal is to make it easy for the user to control the few things that matter most for specific tasks.”
Exploring the potential of technology
Dr Pablo Celnik, project principal investigator in the Department of PMR, said: “The human-machine interaction demonstrated in this project denotes the potential capabilities that can be developed to help people with disabilities.”
While the DARPA program officially ended in August 2020, the team at APL and the Johns Hopkins School of Medicine continues to collaborate with colleagues at other institutions to demonstrate and explore the potential of the technology.
The next iteration of the system may integrate previous research that found that providing sensory stimulation to amputees enabled them not only to perceive their phantom limb but also to use muscle movement signals from the brain to control a prosthetic.
The theory is that the addition of sensory feedback, delivered straight to a person’s brain, may help them perform some tasks without the constant visual feedback required in the current experiment.
“This research is a great example of this philosophy where we knew we had all the tools to demonstrate this complex bimanual activity of daily living that non-disabled people take for granted,” Tenore said.
“Many challenges still lie ahead, including improved task execution, in terms of both accuracy and timing, and closed-loop control without the constant need for visual feedback.”
Celnik said that future research will explore the boundaries of these interactions, even beyond basic activities of daily living.