This Brain-Machine Interface Can Control Bionic Prosthetics

People end up with motor impairments for many different reasons. The culprit could be anything from a limb lost to injury, to neuromuscular disease. Fortunately, in many of these cases the motor cortex is unharmed and still functions. IBM Research-Australia is developing a groundbreaking brain-machine interface that uses that intact motor cortex to control robotic prosthetics.

That’s possible because, physiologically speaking, the motor cortex is still trying to direct the body, even if the limb it’s trying to direct is missing. Like a stereo with its speakers disconnected, it still produces output; only the connection is broken. The team’s brain-machine interface reads that output from the motor cortex with a commercially-available OpenBCI EEG (electroencephalogram) headset. Those brain waves are then processed to determine the user’s intended action, and the system directs a robotic arm to execute it.
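
The article doesn’t describe how IBM’s system actually pulls data off the headset, but as a rough illustration of the acquisition step, here is a minimal sketch that streams raw EEG from an OpenBCI Cyton board using the open-source BrainFlow library. The serial port, capture duration, and board choice are assumptions for the example.

```python
# Illustrative only: stream a couple of seconds of EEG from an OpenBCI Cyton
# board via BrainFlow. Not IBM's code; port and board are assumptions.
import time

from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds

params = BrainFlowInputParams()
params.serial_port = "/dev/ttyUSB0"  # adjust for your dongle and OS

board_id = BoardIds.CYTON_BOARD.value
board = BoardShim(board_id, params)

board.prepare_session()
board.start_stream()
time.sleep(2)  # collect roughly two seconds of samples

data = board.get_board_data()                     # 2D array: rows x samples
eeg_channels = BoardShim.get_eeg_channels(board_id)
eeg = data[eeg_channels, :]                       # keep only the EEG rows

board.stop_stream()
board.release_session()

print(f"Captured {eeg.shape[1]} samples on {eeg.shape[0]} EEG channels")
```

Windows of this kind of raw signal are what the decoding stage described below would consume.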

Determining intent from EEG readings has historically been difficult. The major breakthrough here is a deep-learning model that decodes those action intentions. Once an intention is decoded, a custom robotic control framework called GraspNet carries out the desired action. It all runs on an NVIDIA Jetson TX1 single-board computer, which keeps costs low. By using exclusively commercially-available parts, IBM Research-Australia has come up with a solution that could be affordable enough to be viable.
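
Neither the decoder nor GraspNet is public, so as a hypothetical sketch of what the decoding stage looks like in general, here is a compact convolutional classifier (PyTorch) that maps a window of multichannel EEG to one of a few grasp commands. The channel count, window length, layer sizes, and command set are all assumptions, not IBM’s architecture.

```python
# Hypothetical decoder sketch: a small CNN that classifies an EEG window into
# a grasp command. Shapes, labels, and architecture are assumptions.
import torch
import torch.nn as nn

N_CHANNELS = 8     # e.g. an 8-channel OpenBCI Cyton headset (assumption)
WINDOW = 250       # samples per decision window, ~1 s at 250 Hz (assumption)
ACTIONS = ["rest", "reach", "grasp", "release"]  # hypothetical command set

class IntentDecoder(nn.Module):
    def __init__(self, n_channels=N_CHANNELS, n_actions=len(ACTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            # temporal convolution along each channel
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            # spatial convolution across channels
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 8)),
            nn.Dropout(0.5),
        )
        self.classifier = nn.Linear(32 * (WINDOW // 8), n_actions)

    def forward(self, x):
        # x: (batch, channels, samples) -> add a singleton "image" dimension
        x = x.unsqueeze(1)
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

decoder = IntentDecoder().eval()
with torch.no_grad():
    window = torch.randn(1, N_CHANNELS, WINDOW)   # one (random) EEG window
    action = ACTIONS[decoder(window).argmax(dim=1).item()]
print(f"Decoded intent: {action}")  # this label would be handed to the arm controller
```

In a real system the decoded label would be passed to the robotic control layer, which translates it into joint trajectories for the prosthetic arm; a network this small is also plausible on a Jetson-class board.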

Cameron Coward
Writer for Hackster News. Proud husband and dog dad. Maker and serial hobbyist. Check out my YouTube channel: Serial Hobbyism