Using Ultrasound Imaging for Gesture Control with EchoFlex

Most gesture control systems work by either physically measuring hand movement (say, with sensors in a glove) or by tracking it with cameras and computer vision. Both have fairly obvious drawbacks: wearing sensors on your hand is uncomfortable and conspicuous, and computer vision requires a camera mounted somewhere with a clear view of your hand. But a newly proposed technology called EchoFlex might make all of that unnecessary.

New research has shown future wearable devices, such as smartwatches, could use ultrasound imaging to sense hand gestures. (📷: Bristol Interaction Group)

EchoFlex, which is mostly just a concept at this point, would use ultrasound imaging to detect the movement of muscles and tendons in your arm. This would, theoretically, allow gestures to be registered with just a band worn on your arm (like a smartwatch). As the research team from the University of Bristol’s Bristol Interaction Group demonstrates, muscle and tendon movement can clearly be seen with ultrasound imaging.

Most of your hand movement comes from tendons connected to muscles in your forearm, not muscles in your hand. So, when you move your fingers to make distinct gestures, the corresponding muscle and tendon movement can be detected without actually looking at your hand. For most people, this tech would be appealing simply because of how unobtrusive it is, but it also has particular potential for people who wear prosthetics.
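To make the idea concrete, here is a minimal sketch of how gestures might be recognized from forearm ultrasound frames. This is purely illustrative and is not EchoFlex's actual pipeline: the gesture names, frame sizes, and the simple nearest-template rule are all assumptions for the example.

```python
import numpy as np

# Hypothetical sketch: recognize a hand gesture by comparing a forearm
# ultrasound frame against a per-gesture "template" image built from
# training frames. All names and shapes here are illustrative.

GESTURES = ["fist", "open_hand", "pinch"]

def make_templates(labeled_frames):
    """Average the training frames for each gesture into one template."""
    return {
        g: np.mean([f for f, label in labeled_frames if label == g], axis=0)
        for g in GESTURES
    }

def classify(frame, templates):
    """Return the gesture whose template is closest (L2 distance)."""
    return min(templates, key=lambda g: np.linalg.norm(frame - templates[g]))

# Toy demo with synthetic 8x8 "ultrasound" frames standing in for real data
rng = np.random.default_rng(0)
base = {g: rng.random((8, 8)) for g in GESTURES}
train = [(base[g] + 0.01 * rng.random((8, 8)), g)
         for g in GESTURES for _ in range(5)]
templates = make_templates(train)

probe = base["fist"] + 0.01 * rng.random((8, 8))
print(classify(probe, templates))  # prints "fist"
```

A real system would of course need far more robust features than raw pixel distance, since ultrasound frames are noisy and vary between sessions, but the basic loop (capture a frame of the forearm, compare it to learned patterns, emit a gesture label) stays the same.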

Prosthetic hand control is just one of EchoFlex’s many envisioned applications. (📷: Bristol Interaction Group)

If, for instance, a person requires a prosthetic hand, then EchoFlex could be used to control it. The system could detect the person trying to move a missing finger, and move the corresponding prosthetic finger. It’ll be a long time before we actually see something like EchoFlex on a consumer level, as there are a number of hurdles to overcome — just making an ultrasound imager small enough is a major challenge. But it’s definitely a promising possibility.