EPFL’s FlyJacket Allows You to Pilot a Drone Using Your Body


Cabe Atwell
Drones / Virtual Reality

Usually, drones are controlled via one of a few methods: a handheld controller, a touchscreen device, or both. Swiss research institution EPFL (École polytechnique fédérale de Lausanne) has taken that control a step further by making it possible to pilot a drone using body gestures. EPFL researchers have designed a flight suit of sorts, known as the FlyJacket, an exosuit that uses motion and gestures to manipulate drones in flight. Although it may seem counterintuitive to fly a drone using gestures, throwing in a VR headset can help negate that feeling and provide a new level of flight immersion at the same time.

The drone is controlled through body movements: pilots spread their arms out like wings and turn, pitch, or roll their torso, and the drone responds in kind, much like a bird in flight. At the same time, pilots flying fixed-wing drones can view the flight path in real time while performing maneuvers. EPFL states the suit is soft and pliable, so moving, bending, and rotating the arms isn't an issue or encumbering, and it even features arm supports to help alleviate fatigue.
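To make that gesture-based control concrete, here is a minimal sketch of the kind of mapping described above, translating torso lean into normalized roll and pitch commands for a fixed-wing drone. The angle range and scaling are illustrative assumptions, not values from EPFL's system.

```python
def torso_to_drone_commands(torso_roll_deg: float, torso_pitch_deg: float,
                            max_torso_angle: float = 30.0) -> dict:
    """Map the wearer's torso roll/pitch (in degrees) to normalized
    fixed-wing roll/pitch commands in the range [-1.0, 1.0]."""
    def clamp(value: float) -> float:
        return max(-1.0, min(1.0, value))

    return {
        "roll": clamp(torso_roll_deg / max_torso_angle),    # lean left/right -> bank
        "pitch": clamp(torso_pitch_deg / max_torso_angle),  # lean forward/back -> dive/climb
    }


# Example: leaning 15 degrees to the right and 5 degrees forward.
print(torso_to_drone_commands(15.0, 5.0))  # {'roll': 0.5, 'pitch': 0.1666...}
```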

When it comes to the suit's tech specifics, the FlyJacket's skin features an elastic fabric layer composed of 'sports fabric' (i.e., breathable polyester mesh), complete with leather shoulder, elbow, and torso joints, offering the user a full range of motion without impedance. The arm supports are linear passive gas springs mounted to 3D-printed support plates on both the arms and torso, providing the wearer with support for prolonged flights.

Tracking the body's movements and translating them into flight commands is done through motion-tracking components, specifically IMU (inertial measurement unit) sensors positioned throughout the suit. Those movements are processed by an onboard (or rather, wearable) STMicroelectronics STM32F100 MCU housed in the torso of the suit. The tracked and processed movements are then relayed to the drone via a transmission unit that sends those movement signals in real time. To provide increased immersion (and an incredibly fun experience), the suit is outfitted with haptic feedback modules that let the user 'feel' the flight while performing maneuvers.
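The overall pipeline, sample the IMUs, derive commands from the wearer's posture, stream them to the drone, and drive the haptics, might look roughly like the sketch below. The helper functions, loop rate, and feedback scheme are hypothetical stand-ins; the suit's actual firmware and radio link are not public here.

```python
import time


def read_imu_orientation() -> tuple:
    """Placeholder: return fused (roll_deg, pitch_deg) from the suit's IMU sensors."""
    return 0.0, 0.0


def send_to_drone(roll_cmd: float, pitch_cmd: float) -> None:
    """Placeholder: relay normalized commands to the drone via the transmission unit."""
    print(f"roll={roll_cmd:+.2f} pitch={pitch_cmd:+.2f}")


def pulse_haptics(intensity: float) -> None:
    """Placeholder: drive the suit's haptic feedback modules."""
    pass


def control_loop(rate_hz: float = 50.0, max_angle_deg: float = 30.0) -> None:
    """Sample posture, scale it into [-1, 1] commands, and stream them at a fixed rate."""
    period = 1.0 / rate_hz
    while True:
        roll_deg, pitch_deg = read_imu_orientation()
        roll_cmd = max(-1.0, min(1.0, roll_deg / max_angle_deg))
        pitch_cmd = max(-1.0, min(1.0, pitch_deg / max_angle_deg))
        send_to_drone(roll_cmd, pitch_cmd)
        pulse_haptics(abs(roll_cmd))  # e.g., stronger feedback in sharper banks
        time.sleep(period)
```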

According to the research paper published on the FlyJacket:

“The development of more intuitive control interfaces could improve flight efficiency, reduce errors, and allow users to shift their attention from the task of control to the evaluation of the information provided by the drone. Human-robot interfaces could be improved by focusing on natural human gestures captured by wearable sensors. Indeed, the use of wearable devices, such as exoskeletons, has been shown to enhance control intuitiveness and immersion.”

The team tested the FlyJacket using a Parrot Bebop 2 quadcopter along with an included smart glove that offers increased functionality through finger gestures. For example, touching the middle finger to the thumb triggers the system to set a point of interest that’s GPS marked so users can come back to that location if need be. As it stands, the researchers are looking to improve their FlyJacket for additional functionality, including ways to adjust drone speeds for precision flying.
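For illustration, the glove's point-of-interest gesture could be handled along these lines: when the middle finger touches the thumb, the drone's current GPS position is saved so the user can return to it later. The contact-sensor and GPS helpers below are hypothetical stand-ins, not the actual FlyJacket or Bebop 2 interfaces.

```python
points_of_interest = []  # list of (latitude, longitude) pairs


def middle_finger_touches_thumb() -> bool:
    """Placeholder: read the glove's finger-contact sensor."""
    return False


def current_drone_gps() -> tuple:
    """Placeholder: query the drone's current (latitude, longitude)."""
    return 46.5191, 6.5668  # EPFL campus coordinates, purely for illustration


def poll_glove() -> None:
    """Record a GPS-marked point of interest whenever the gesture is detected."""
    if middle_finger_touches_thumb():
        points_of_interest.append(current_drone_gps())
```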

UPDATE: EPFL has shared a new video with some more detail on the jacket itself.
