Imagine changing the channel of your TV simply by moving your cup of tea, adjusting the volume on a music player by rolling a toy car, or rotating a spatula to pause a cookery video on your tablet.
New gesture control technology that can turn everyday objects into remote controls could revolutionise how we interact with televisions and other screens – ending frustrating searches for remotes that have slipped down the side of sofa cushions.
In a paper – ‘Matchpoint: Spontaneous spatial coupling of body movement for touchless pointing’ – which will be presented at the UIST2017 conference in Quebec City this October, researchers from Lancaster University show a novel technique that allows body movement, or movement of objects, to be used to interact with screens.
The ‘Matchpoint’ technology, which only requires a simple webcam, works by displaying moving targets that orbit a small circular widget in the corner of the screen. These targets correspond to different functions – such as volume, changing channel or viewing a menu. The user synchronises their hand, head or an object with a target’s direction of movement to achieve what the researchers call ‘spontaneous spatial coupling’, which activates the corresponding function.
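The matching step described above can be illustrated with a minimal sketch. The paper's actual implementation is not detailed here, so the following is only an assumption of how motion matching might work: compare the recent trajectory of a tracked point against the trajectory of each orbiting target, and couple when the correlation is high. The function name `coupling_score` and the window of 60 samples are illustrative, not from the paper.

```python
import numpy as np

def coupling_score(user_xy, target_xy):
    """Correlate a tracked point's trajectory with an on-screen target's.

    user_xy, target_xy: arrays of shape (N, 2) holding the last N
    (x, y) positions. Returns the mean Pearson correlation of the
    x and y components; a value near 1.0 means the motions are in sync.
    """
    scores = []
    for axis in range(2):
        u = user_xy[:, axis]
        t = target_xy[:, axis]
        # A stationary point or target cannot be correlated.
        if u.std() == 0 or t.std() == 0:
            return 0.0
        scores.append(np.corrcoef(u, t)[0, 1])
    return float(np.mean(scores))

# A point moving in sync with a circular target couples strongly,
# regardless of where it is or how large its motion is; an unrelated
# linear motion does not.
phase = np.linspace(0, 2 * np.pi, 60)
target = np.stack([np.cos(phase), np.sin(phase)], axis=1)
in_sync = target * 0.5 + np.array([3.0, 1.0])    # same motion, offset and scaled
unrelated = np.stack([phase, np.zeros_like(phase)], axis=1)

assert coupling_score(in_sync, target) > 0.9
assert coupling_score(unrelated, target) < 0.5
```

Note that the correlation is invariant to where the motion happens and how large it is, which is consistent with the article's point that no calibration or specific body part is needed – only the direction of movement has to match.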
Unlike existing gesture control technology, the software does not look for a specific body part it has been trained to identify – such as a hand. Instead, Lancaster’s technology looks for rotating movement, so it requires neither calibration nor prior knowledge of objects. This gives the user much more flexibility and ease: it works even while their hands are full, and whether they are standing or slouching on the sofa.
Users also do not need to learn specific commands to activate different functions, as is the case with some gesture-controlled televisions on the market, and the user is able to decouple at will.
When the user selects volume adjustment or channel selection, sliders appear. The user then moves their hand, head or object in the direction indicated by the slider to change the volume or find the desired channel.
As well as televisions, the technology can also be used with other screens. For example, YouTube tutorials, such as mending bikes or baking cakes, could be easily paused and rewound on tablet computers without users having to put down tools or mixing bowls.
Multiple pointers can be created to allow more than one user to point at drawings or pictures on interactive whiteboards simultaneously. Matchpoint also allows users to manipulate images on whiteboards by using two hands to zoom in and out, and rotate images.
In addition to short-term couplings, users can also link stationary objects to controls; these objects retain their control function even when left unused for prolonged periods. For example, a mug sitting on a table could change a track on a music player when moved left or right, and a rolling toy car could be used to adjust volume. Objects lose their coupling with controls simply by being removed from the camera’s field of view.
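The lifecycle described above – a coupling that persists while the object stays in view, even when stationary, and is dropped once the object leaves the frame – can be sketched as follows. This is a hypothetical illustration, assuming a tracker that reports which tracked objects are visible in each frame; the names `Coupling` and `update_couplings` are invented for this sketch, not taken from the paper.

```python
class Coupling:
    """Links one tracked object to one control function."""
    def __init__(self, object_id, control):
        self.object_id = object_id   # id of the tracked object, e.g. "mug"
        self.control = control       # e.g. "next_track" or "volume"

def update_couplings(couplings, visible_ids):
    """Keep a coupling alive while its object is in view, even if the
    object is stationary; drop it once the object leaves the frame."""
    return [c for c in couplings if c.object_id in visible_ids]

couplings = [Coupling("mug", "next_track"), Coupling("toy_car", "volume")]
# The toy car is carried out of the camera's field of view;
# only the mug stays coupled to its control.
couplings = update_couplings(couplings, visible_ids={"mug"})
assert [c.control for c in couplings] == ["next_track"]
```

The design point this captures is that decoupling needs no explicit command: removing the object from view is itself the gesture that releases the control.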
Christopher Clarke, PhD student at Lancaster University’s School of Computing and Communications, and developer of the technology, said: “Spontaneous spatial coupling is a new approach to gesture control that works by matching movement instead of asking the computer to recognise a specific object.
“Our method allows for a much more user-friendly experience where you can change channels without having to put down your drink, or change your position, whether that is relaxing on the sofa or standing in the kitchen following a recipe.
“Everyday objects in the house can now easily become remote controls so there are no more frantic searches for remote controls when your favourite programme is about to start on another channel, and now everyone in the room has the ‘remote’. You could even change the channel with your pet cat.”
The researchers believe Matchpoint is also well suited for use as an accessibility tool for people who are unable to use traditional pointing devices, such as remote controls or a mouse and keyboard.