Professor Otmar Hilliges and his staff at ETH Zurich have developed a new app enabling users to operate their smartphone with gestures. This development expands the range of potential interactions with such devices.
It does seem slightly odd at first: you hold the phone in one hand and move the other in the air above its built-in camera, making gestures that resemble sign language. Sometimes you move your index finger to the left, sometimes to the right; you can spread out your fingers, or imitate a pair of pliers or the firing of a pistol. These gestures are not, however, intended for communicating with deaf people; they are for controlling your smartphone.
By mimicking the firing of a pistol, for example, a user can switch to another browser tab, change the map’s view from satellite to standard, or shoot down enemy planes in a game. Spreading out your fingers magnifies a section of a map or scrolls the page of a book forwards.
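In software terms, each recognised gesture is simply bound to a command, and the active app decides what that command means. The sketch below illustrates this idea with a plain lookup table; the gesture names and command strings are illustrative assumptions, not taken from the ETH app.

```python
# Hypothetical gesture-to-command dispatch (illustrative only).
# Each recognised gesture name maps to an app-level command string.

GESTURE_COMMANDS = {
    "pistol": "switch_browser_tab",
    "spread_fingers": "zoom_in",
    "swipe_left": "previous_page",
    "swipe_right": "next_page",
}

def dispatch(gesture: str) -> str:
    """Return the command bound to a recognised gesture, or a no-op."""
    return GESTURE_COMMANDS.get(gesture, "ignore")
```

A real app would route these commands through its UI layer, so the same gesture can mean "change tab" in a browser and "fire" in a game.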
All this gesturing wizardry is made possible by a new type of algorithm developed by Jie Song, a Master’s student in the working group headed by Otmar Hilliges, Professor of Computer Science. The researchers presented the app to an audience of industry professionals at the UIST symposium in Honolulu, Hawaii.
Intelligent programming conserves computer memory
The program uses the smartphone’s built-in camera to register its environment. It does not evaluate depth or colour. The information it does register – the shape of the gesture, the parts of the hand – is reduced to a simple outline that is classified according to stored gestures. The program then executes the command associated with the gesture it observes. The program also recognises the hand’s distance from the camera and warns the user when the hand is either too close or too far away.
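The pipeline described above — reduce the hand to a simple outline, match it against stored gestures, and check the hand's apparent size to judge its distance — can be sketched as follows. This is a minimal toy illustration under stated assumptions, not the ETH algorithm: silhouettes are represented as flattened binary grids, matching is done by Hamming distance to toy templates, and silhouette area stands in as a crude proxy for distance.

```python
# Toy sketch of outline-based gesture classification (not the ETH algorithm).
# A hand silhouette is a flattened binary grid; we classify it by nearest
# stored template and warn when the hand seems too close or too far.

from typing import Dict, Tuple

Grid = Tuple[int, ...]  # flattened binary silhouette, e.g. 8x8 -> 64 cells

# Illustrative templates only; a real system would learn these from data.
TEMPLATES: Dict[str, Grid] = {
    "point_left": tuple([1] * 8 + [0] * 56),
    "spread": tuple([1, 0] * 32),
}

def hamming(a: Grid, b: Grid) -> int:
    """Number of cells where two silhouettes disagree."""
    return sum(x != y for x, y in zip(a, b))

def classify(outline: Grid) -> str:
    """Return the stored gesture whose template is nearest the outline."""
    return min(TEMPLATES, key=lambda name: hamming(outline, TEMPLATES[name]))

def distance_warning(outline: Grid, lo: float = 0.1, hi: float = 0.6) -> str:
    """Use silhouette area as a rough proxy for hand-to-camera distance."""
    area = sum(outline) / len(outline)
    if area > hi:
        return "too close"
    if area < lo:
        return "too far"
    return "ok"
```

Working on coarse binary outlines rather than full depth or colour data is what keeps the per-frame cost low enough for a phone.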
“Many movement-recognition programs need plenty of processor and memory power,” explains Hilliges, adding that their new algorithm uses a far smaller portion of computer memory and is thus ideal for smartphones. He believes the application is the first of its kind that can run on a smartphone. The app’s minimal processing footprint means it could also run on smartwatches or in augmented-reality glasses.