MIT Turns to Brainwaves and Hand Gestures to Control Robots

A team of engineers from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) is developing a method to correct robotic mistakes…

Cabe Atwell
Robotics

A team of engineers from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) is developing a method to correct robotic mistakes using nothing but brainwaves and hand gestures. In other words, they are developing a more natural way to control robots and to teach them to recover from their mistakes.

Their system monitors brain activity using an EEG headset, which picks up the wearer’s ‘error-related potentials’ (ErrPs), the signal that naturally occurs in the brain when a person notices a mistake being made. When an error signal is detected, the EEG interface stops the robot from continuing with its task so the user can correct it.
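The article doesn’t include CSAIL’s code, but the control flow is straightforward to sketch. The snippet below is a minimal illustration in Python with NumPy; the window size, channel count, classifier weights, and the `robot.stop()` call are placeholders for the example, not MIT’s actual implementation. It scores each incoming window of EEG samples for an error-related potential and pauses the robot when the score crosses a threshold.

```python
import numpy as np

WINDOW = 200          # samples per EEG window (placeholder: ~0.8 s at 250 Hz)
N_CHANNELS = 8        # number of EEG electrodes (placeholder)
ERRP_THRESHOLD = 0.5  # decision threshold for the error classifier (placeholder)

# Placeholder linear classifier: in practice the weights would be learned
# from labeled ErrP recordings, not hard-coded zeros.
weights = np.zeros(WINDOW * N_CHANNELS)
bias = 0.0

def errp_score(window: np.ndarray) -> float:
    """Score one EEG window (shape: N_CHANNELS x WINDOW) for an error potential."""
    features = window.ravel()  # flatten channels x time into one feature vector
    return 1.0 / (1.0 + np.exp(-(weights @ features + bias)))  # logistic score

def supervise(eeg_stream, robot):
    """Stop the robot as soon as the wearer's brain flags a mistake."""
    for window in eeg_stream:  # yields N_CHANNELS x WINDOW arrays
        if errp_score(window) > ERRP_THRESHOLD:
            robot.stop()       # pause the task so the user can correct it
            break
```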

Correction is done using an arm-strapped EMG sensor that detects muscle activity and maps the user’s motions or gestures onto corrections for the robot. The engineers noted that although both EMG and EEG have their shortcomings (EEG signals are not always detectable, and EMG signals can be difficult to map to precise motion), merging the two provides a more robust platform and allows users to operate the system without training.
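To give a feel for how the gesture side might slot in once the EEG stage has paused the robot, here is another illustrative sketch. The electrode layout, noise floor, and left/right gesture vocabulary are assumptions for the example, not CSAIL’s design: it compares activity on two forearm EMG channels, reads it as a left or right flick, and nudges the robot’s chosen target accordingly.

```python
import numpy as np

def rms(signal: np.ndarray) -> float:
    """Root-mean-square amplitude of one EMG channel."""
    return float(np.sqrt(np.mean(signal ** 2)))

def classify_gesture(emg_window: np.ndarray) -> str:
    """Map forearm muscle activity to a correction gesture.

    emg_window: 2 x N array, one row per electrode (placeholder layout:
    row 0 over the flexors, row 1 over the extensors).
    """
    flexor, extensor = rms(emg_window[0]), rms(emg_window[1])
    if max(flexor, extensor) < 0.05:   # placeholder noise floor: no clear gesture
        return "none"
    return "left" if flexor > extensor else "right"

def correct_target(current_target: int, gesture: str, n_targets: int = 3) -> int:
    """Shift the robot's chosen target left or right based on the gesture."""
    if gesture == "left":
        return max(0, current_target - 1)
    if gesture == "right":
        return min(n_targets - 1, current_target + 1)
    return current_target              # no gesture detected: keep the target
```

The appeal of this split is that the EEG only has to answer a coarse “is something wrong?” question while the EMG carries the finer “which way?” information, which is part of why the combined system can be used without per-user training.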

The team used Rethink Robotics’ Baxter robot to develop the system and demonstrated the interface’s abilities by having it move a power drill to one of three possible targets on the fuselage of a mock airplane. On its own, Baxter hit its assigned target 70% of the time; with human supervision, that figure rose to 97%, a significant improvement.
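As a rough back-of-envelope reading of those numbers (an illustrative assumption, not a figure from the study): if Baxter is right 70% of the time on its own, the human in the loop would have to catch roughly nine out of ten of the remaining errors to reach 97%.

```python
baseline = 0.70    # Baxter's accuracy without help (from the article)
supervised = 0.97  # accuracy with brainwave/gesture supervision (from the article)

# Fraction of the robot's errors the human would need to catch, assuming each
# caught error is corrected successfully (illustrative model, not the study's).
caught = (supervised - baseline) / (1 - baseline)
print(f"Errors caught by the human: {caught:.0%}")  # -> 90%
```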

“By looking at both muscle and brain signals, we can start to pick up on a person’s natural gestures along with their snap decisions about whether something is going wrong. This helps make communicating with a robot more like communicating with another person.” — Project lead, Ph.D. candidate Joseph DelPreto

The engineers hope their system will one day prove useful for the elderly and for people with limited mobility or language disorders. It is a different approach to the human/machine interface: instead of you having to adapt to the robot, the robot adapts to you. It will be interesting to see what can be done with MIT’s system if it ever reaches the market.
