Elliptic Labs brings touchless gesturing to smartphones using ultrasound technology

1 Oct 2013

Still from 'Elliptic Labs Promo' by Elliptic Laboratories AS on Vimeo

Visitors to the CEATEC advanced technologies trade show in Japan this week will get the opportunity to see Android smartphones controlled using ultrasound-based touchless gesturing from Elliptic Labs, and developers can now get involved with the release of the SDK.

A pioneer in ultrasonic touchless gesturing, Elliptic Labs started by bringing this form of interface manipulation to Windows 8 computers and laptops last year. Now, the company has launched its first SDK for touchless gesturing on Android smartphones.

Unlike other touchless gesture technology that relies on in-built cameras, Elliptic Labs’ technology uses an ultrasound sensor that responds to soundwave disruptions caused by hand movements.

Because it uses sound rather than sight, the sensor can recognise gestures across a full 180-degree field surrounding the device display. It also consumes less power than camera-based alternatives and works just as well in the dark.
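To illustrate the general principle at work (this is not Elliptic Labs' actual implementation, and all frequencies and parameters below are assumed for the sake of the sketch), a device can emit an inaudible tone from its speaker and analyse the echo picked up by the microphone: a hand moving towards or away from the device Doppler-shifts the reflected tone, and the size and sign of that shift reveal the motion.

```python
# Minimal sketch of ultrasonic motion sensing via the Doppler effect.
# Hypothetical parameters; real systems track multiple reflections,
# several microphones and richer features to classify full gestures.
import numpy as np

FS = 96_000   # sample rate (Hz), assuming hardware that can capture ultrasound
F0 = 22_000   # emitted carrier frequency (Hz), just above human hearing
C = 343.0     # speed of sound in air (m/s)
N = 8192      # samples per analysis frame

def simulate_echo(hand_velocity_mps: float) -> np.ndarray:
    """Simulate one microphone frame: the direct carrier plus an echo
    reflected off a hand moving at the given velocity (positive = approaching)."""
    t = np.arange(N) / FS
    doppler_shift = 2 * F0 * hand_velocity_mps / C   # round-trip Doppler shift
    carrier = np.sin(2 * np.pi * F0 * t)
    echo = 0.3 * np.sin(2 * np.pi * (F0 + doppler_shift) * t)
    noise = 0.05 * np.random.randn(N)
    return carrier + echo + noise

def estimate_velocity(frame: np.ndarray) -> float:
    """Estimate hand velocity from the strongest spectral peak near the carrier."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(N)))
    freqs = np.fft.rfftfreq(N, d=1 / FS)
    # Search a band around the carrier, excluding the carrier itself.
    band = (np.abs(freqs - F0) > 20) & (np.abs(freqs - F0) < 500)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return (peak_freq - F0) * C / (2 * F0)

if __name__ == "__main__":
    frame = simulate_echo(hand_velocity_mps=0.5)   # hand approaching at 0.5 m/s
    print(f"estimated velocity: {estimate_velocity(frame):+.2f} m/s")
```

Because this approach listens for reflections rather than looking for them, it keeps working in darkness and does not need the hand to be in front of a camera lens, which is what allows the wide sensing field described above.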


Smartphones fitted with tiny microphones, transducers and Elliptic Labs software will be demonstrated at CEATEC in Tokyo this week, while the SDK is available from 2 October.

This will allow Android smartphone manufacturers to make use of the ultrasound spectrum and enable touchless gesturing on their devices easily and cost-effectively, said Elliptic Labs CEO Laila Danielsen. “Our technology is also great for playing games on smartphones,” she added. “It uses little power and with our high resolution, you will be able to play popular games such as Fruit Ninja, Subway Surfers or any other games that require high relative accuracy and speed.”

Elaine Burke is the host of For Tech’s Sake, a co-production from Silicon Republic and The HeadStuff Podcast Network. She was previously the editor of Silicon Republic.

editorial@siliconrepublic.com