Kinect-like Touchless Gesture User Interface For iPad Will Be Demoed At CES

BY Whizkid

Published 31 Dec 2010

Norway-based Elliptic Labs is working on a touchless gesture user interface for Apple's iPad called Mimesign. The technology, similar in spirit to Microsoft's Kinect, will allow users to control various functions of the iPad without actually having to touch the screen.

The company already appears to have a prototype ready, which it will be demoing at CES 2011 when the show kicks off early next month.

According to Mobile Magazine, unlike the Kinect, which uses cameras to detect your position and motion, Mimesign uses ultrasound technology to do (almost) the same thing. An ultrasound field generated about one foot on either side of a docked iPad lets users simply wave their hands in front of the screen to interact with the menu on the iPad's display.
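
Elliptic Labs hasn't published how Mimesign works internally, but the basic principle of ultrasonic sensing is well known: a speaker emits a short ultrasonic burst, a microphone picks up the echo reflected off the user's hand, and the round-trip delay gives the hand's distance. The Python sketch below is purely illustrative and simulates that idea; the sample rate, chirp frequency, and hand positions are assumed values, not anything from Elliptic Labs.

```python
# Illustrative sketch only -- not Elliptic Labs' implementation.
# Idea: emit an ultrasonic chirp, cross-correlate the echo to find the
# round-trip delay, convert delay to distance, and read a gesture from
# how that distance changes over successive pings.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s at room temperature (assumed)
SAMPLE_RATE = 192_000    # Hz, assumed ADC rate
CHIRP_FREQ = 40_000      # Hz, typical ultrasonic transducer frequency
CHIRP_DURATION = 0.001   # seconds

def make_chirp():
    """Generate a short 40 kHz tone burst used as the probe signal."""
    t = np.arange(0, CHIRP_DURATION, 1 / SAMPLE_RATE)
    return np.sin(2 * np.pi * CHIRP_FREQ * t)

def simulate_echo(chirp, hand_distance_m, record_seconds=0.01):
    """Return a simulated microphone recording containing the hand's echo."""
    recording = np.random.normal(0, 0.01, int(SAMPLE_RATE * record_seconds))
    round_trip = 2 * hand_distance_m / SPEED_OF_SOUND
    start = int(round_trip * SAMPLE_RATE)
    recording[start:start + len(chirp)] += 0.5 * chirp
    return recording

def estimate_distance(chirp, recording):
    """Cross-correlate the recording with the chirp to locate the echo."""
    corr = np.correlate(recording, chirp, mode="valid")
    delay_samples = int(np.argmax(np.abs(corr)))
    round_trip = delay_samples / SAMPLE_RATE
    return round_trip * SPEED_OF_SOUND / 2

if __name__ == "__main__":
    chirp = make_chirp()
    # Simulated hand sweeping toward the docked tablet over four pings.
    sweep = [0.30, 0.25, 0.20, 0.15]  # metres (assumed positions)
    estimates = [estimate_distance(chirp, simulate_echo(chirp, d)) for d in sweep]
    print("estimated distances (m):", [round(e, 3) for e in estimates])
    if estimates[0] > estimates[-1]:
        print("gesture: hand approaching -> could map to 'next song'")
```

In a real system the transmit and receive paths would run continuously on dedicated transducers, and several microphones would be combined to estimate the hand's position in 2D or 3D rather than just its distance, but the time-of-flight idea above is the core of it.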

Stian Aldrin, CEO of Elliptic Labs, explains:

“The idea is that you use touch-less gestures to operate primary functions of a docked tablet in situations like when you have wet or greasy hands in the kitchen. In general tablets are made for being hand held. When it is docked you are often walking or standing further away, and then using a finger on the screen involves a change of modality. Rather than bending down, leaning forward or picking it up you can use larger movements a little bit further away to do things like volume up or next song without changing modality.”

You can also check out a demo video of the prototype system in action below:

What’s your take on Mimesign? Do you fancy using a touchless gesture user interface? Please share your views in the comments section below.

[Via Mobile Mag, Elliptic Labs]