This Artificial Intelligence Can See Touch and Feel Sights


None of your senses work alone, and your brain automatically weaves together information from all of your senses to better understand the world around you. By doing so, you quickly learn to find correlations that you can then use to form predictions. For instance, you might know that your frying pan is hot based on its smell, or that a tool is sharp because of how its edge looks. But robots and computers have a difficult time doing that, which is why researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed an artificial intelligence that can see touch and feel sights.

The team paired their artificial intelligence with a KUKA robot arm equipped with either a simple webcam or a special tactile sensor called GelSight. That allows the artificial intelligence to either see an object or touch it, but not both at the same time. They then built a training data set from almost 200 objects being touched more than 12,000 times. That data set includes both images of the specific part of each object being touched and the corresponding tactile data. The data set was then used to train the artificial intelligence's generative adversarial network, which is able to work with relatively small data sets.
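To make that cross-modal setup more concrete, here is a minimal, hypothetical PyTorch sketch of how paired image/tactile samples might be organized and fed to a conditional generator that predicts a GelSight-style tactile map from an image patch. The tensor shapes, network sizes, and training loop are illustrative assumptions, not details taken from the CSAIL work.

```python
# Illustrative sketch only: paired vision/tactile data plus a small
# conditional generator (image patch in, predicted tactile map out).
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader


class VisionTouchPairs(Dataset):
    """Aligned (image patch, tactile reading) pairs, one per touch event."""

    def __init__(self, images, tactile_maps):
        assert len(images) == len(tactile_maps)
        self.images = images              # assumed shape: (N, 3, 64, 64)
        self.tactile_maps = tactile_maps  # assumed shape: (N, 1, 64, 64)

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        return self.images[idx], self.tactile_maps[idx]


class TouchGenerator(nn.Module):
    """Conditional generator: predicts a tactile map from an image patch."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),            # 64 -> 32
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),           # 32 -> 16
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),   # 32 -> 64
            nn.Tanh(),
        )

    def forward(self, image):
        return self.net(image)


if __name__ == "__main__":
    # Synthetic stand-in data: eight random "touch events".
    images = torch.randn(8, 3, 64, 64)
    tactile = torch.randn(8, 1, 64, 64)
    loader = DataLoader(VisionTouchPairs(images, tactile), batch_size=4)

    gen = TouchGenerator()
    opt = torch.optim.Adam(gen.parameters(), lr=2e-4)
    l1 = nn.L1Loss()

    # Reconstruction-only training step; a full GAN would also train a
    # discriminator that scores (image, tactile) pairs as real or generated.
    for img, touch in loader:
        pred = gen(img)
        loss = l1(pred, touch)
        opt.zero_grad()
        loss.backward()
        opt.step()
```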

After training, the artificial intelligence could perform two distinct tasks. First, if it was shown an object, it could predict what different parts of that object would feel like and determine the best way to pick it up. Second, if it was allowed to touch the object, it could guess what the object was and which part of the object it was touching. Eventually, an artificial intelligence system like this could help robots interact with the world in more sophisticated ways. Instead of relying only on computer vision or explicit sensor readings, such a robot would be able to infer information about objects by using all of its senses.
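As an illustration of those two directions, the hypothetical sketch below reuses the TouchGenerator idea from the earlier snippet for sight-to-touch prediction and adds a simple classifier for touch-to-object identification. The object labels and shapes are made up for the example and are not from the original research.

```python
# Hypothetical inference sketch for the two directions described above.
import torch
import torch.nn as nn


class TouchClassifier(nn.Module):
    """Guesses which object a tactile reading came from."""

    def __init__(self, num_objects):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 4, stride=2, padding=1),  # 64 -> 32
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(16, num_objects),
        )

    def forward(self, tactile_map):
        return self.net(tactile_map)


if __name__ == "__main__":
    objects = ["mug", "screwdriver", "sponge"]  # made-up labels
    image_patch = torch.randn(1, 3, 64, 64)
    tactile_reading = torch.randn(1, 1, 64, 64)

    # Direction 1: sight -> predicted feel (would reuse TouchGenerator above).
    # predicted_touch = TouchGenerator()(image_patch)

    # Direction 2: feel -> guessed object.
    classifier = TouchClassifier(num_objects=len(objects))
    logits = classifier(tactile_reading)
    print("guessed object:", objects[logits.argmax(dim=1).item()])
```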

Cameron Coward
Writer for Hackster News. Proud husband and dog dad. Maker and serial hobbyist. Check out my YouTube channel: Serial Hobbyism