For less than $100, team builds glove to translate sign language into text

13 Jul 2017


Making the letter ‘L’ in American Sign Language. Image: ANOCHA KOLUANG/Shutterstock


Those who use sign language will soon be able to translate their words into text with a new, affordable smart glove.

Despite many attempts over the years to use technology to give sign language users greater power to communicate, there remain few, if any, affordable tools to turn movement into text.

Now, however, a team from the University of California, San Diego has published a study detailing their new device that costs less than $100, but can translate American Sign Language (ASL) into text and then transmit it to an electronic device.

Existing tools – such as camera-based tracking software, and infrared optical emitters and receivers – have proven successful, but the power consumption, weight and cost of these technologies make them impractical.

To make this latest breakthrough, the team of Timothy O’Connor and Darren Lipomi used nine flexible strain sensors – two on each finger and one on the thumb – to detect knuckle articulation.

The microprocessor on board the smart glove can then translate these movements into words and transmit them to a smartphone or other device over Bluetooth.

Sign language glove

Overview of the gesture-decoding glove. Image: Timothy O’Connor et al (2017)

Uses in virtual reality?

What makes this a potential major breakthrough is that the system is not only cheap, but also designed to consume very little power.

The researchers found that the glove could accurately determine all 26 letters of the ASL alphabet and, based on fatigue studies of the sensors, that the system would continue to translate the letters correctly even after the knuckles had been flexed to their maximum 1,000 times.

Even more intriguing to O’Connor and Lipomi is that data from the glove could also generate an accurate virtual display by mimicking a hand, suggesting new ways of using flexible, wearable electronics to interface in virtual reality.

While still in the early stages of development, it is hoped that the glove can be a test system for evaluating the performance of new materials and stretchable hybrid electronics.

Their findings are published in the open-access journal PLOS ONE.

Colm Gorey is a journalist with Siliconrepublic.com

editorial@siliconrepublic.com