Prosthetic Hand Uses AI for Complex Motions Like in Rock, Paper, Scissors

It was only a few decades ago that prosthetic limbs weren’t much more than rigid lumps of plastic that were designed almost entirely for aesthetic purposes. Now we’re capable of building complex robotic prosthetics, and can do so more affordably than ever. Robotic hands, for example, can have dexterity that approaches that of a human hand. The problem is that they’re difficult for people to control. That’s why researchers from the Biological Systems Engineering Lab at Hiroshima University in Japan are turning to AI to help improve the practicality of robotic prosthetics.

Imagine that you’ve lost your hand in a terrible accident. Then you’re given the most advanced robotic prosthetic ever devised, which is capable of human-like motion. How do you actually control those motions? That question has been plaguing roboticists and medical professionals. The robotic prosthetic may be perfect, but it’s useless if the wearer can’t control it. Fortunately, your forearm muscles still produce electrical activity when you try to move your fingers — even if your hand is gone. Those electrical signals can be used to control the prosthetic; we just need a way to interpret them reliably.

To do that, these researchers created a neural network they call the Cybernetic Interface. The neural network is first trained on the electrical signals picked up by electrodes placed on the skin. During training, the user might say “I’m attempting to move my index finger,” and over time the neural network learns which electrical signals correlate with specific kinds of finger movement. After training, it can monitor those electrical signals and infer what the user is trying to do with their robotic prosthetic.
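To make the idea concrete, here’s a minimal sketch of how a classifier like that could be trained: windows of surface-EMG samples go in, gesture labels come out. This is not the researchers’ actual code — the channel count, window length, network architecture, and synthetic training data are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the published system): train a small
# neural network to map windows of surface-EMG samples to gesture labels.
import torch
import torch.nn as nn

N_CHANNELS = 8        # assumed number of forearm electrodes
WINDOW = 200          # assumed samples per classification window
GESTURES = ["rock", "paper", "scissors"]

class EMGClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(N_CHANNELS * WINDOW, 128),
            nn.ReLU(),
            nn.Linear(128, len(GESTURES)),
        )

    def forward(self, x):  # x shape: (batch, channels, window)
        return self.net(x)

# Synthetic stand-in for EMG recorded while the user announces the
# intended movement ("I'm attempting to move my index finger").
x_train = torch.randn(512, N_CHANNELS, WINDOW)
y_train = torch.randint(0, len(GESTURES), (512,))

model = EMGClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x_train), y_train)
    loss.backward()
    optimizer.step()

# After training, a live EMG window is classified and the predicted
# gesture would be forwarded to the prosthetic hand's controller.
with torch.no_grad():
    live_window = torch.randn(1, N_CHANNELS, WINDOW)
    gesture = GESTURES[model(live_window).argmax(dim=1).item()]
    print("predicted gesture:", gesture)
```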

To demonstrate that capability, the researchers had test subjects play a game of rock, paper, scissors. In testing, their 3D-printed robotic hand was able to interpret the users’ electrical signals in just five milliseconds and perform the correct gesture 95 percent of the time. The system can then combine those basic motions to perform more complicated actions, such as picking up a water bottle. This system does require that the forearm be intact, but could dramatically improve the usability of robotic prosthetics for users who fit that requirement.

Cameron Coward
Writer for Hackster News. Proud husband and dog dad. Maker and serial hobbyist. Check out my YouTube channel: Serial Hobbyism