Imagine changing the channel of your TV simply by moving your cup of tea, adjusting the volume on a music player by rolling a toy car, or rotating a spatula to pause a cookery video on your tablet.
New gesture control technology that can turn everyday objects into remote controls could revolutionise how we interact with televisions and other screens – ending frustrating searches for remotes that have slipped down the side of sofa cushions.
In a paper – ‘Matchpoint: Spontaneous spatial coupling of body movement for touchless pointing’ – which will be presented at the UIST 2017 conference in Quebec City this October, researchers from Lancaster University present a novel technique that allows body movement, or the movement of objects, to be used to interact with screens.
The ‘Matchpoint’ technology, which only requires a simple webcam, works by displaying moving targets that orbit a small circular widget in the corner of the screen. These targets correspond to different functions – such as volume, changing channel or viewing a menu. The user synchronises the movement of their hand, head or an object with the direction of a target to achieve what researchers call ‘spontaneous spatial coupling’, which activates the corresponding function.
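The coupling step described above can be sketched in code: the motion of a tracked point is compared frame by frame against the velocity of each orbiting target, and a coupling fires once the directions agree closely over a sliding window. This is an illustrative reconstruction, not the paper's implementation – the class name, window length and similarity threshold are all assumptions.

```python
import math
from collections import deque

def _cos(a, b):
    """Cosine similarity of two 2-D velocity vectors (0 if either is zero)."""
    na, nb = math.hypot(*a), math.hypot(*b)
    if na == 0 or nb == 0:
        return 0.0
    return (a[0] * b[0] + a[1] * b[1]) / (na * nb)

class SpatialCoupler:
    """Couples tracked motion to the orbiting target it follows most closely
    over a sliding window of frames. Window length and threshold are assumed
    tuning values, not figures from the Matchpoint paper."""
    def __init__(self, window=30, threshold=0.9):
        self.window = window
        self.threshold = threshold
        self.user = deque(maxlen=window)
        self.targets = {}

    def update(self, user_vel, target_vels):
        """Feed one frame of velocities; return a coupled target name or None."""
        self.user.append(user_vel)
        for name, v in target_vels.items():
            self.targets.setdefault(name, deque(maxlen=self.window)).append(v)
        if len(self.user) < self.window:
            return None  # not enough motion history to judge yet
        best_name, best_score = None, self.threshold
        for name, vels in self.targets.items():
            score = sum(_cos(u, v) for u, v in zip(self.user, vels)) / len(vels)
            if score > best_score:
                best_name, best_score = name, score
        return best_name

# Two targets orbit in opposite directions; the user's hand follows "volume".
coupler = SpatialCoupler()
coupled = None
for t in range(60):
    a = t * 0.2
    clockwise = (-math.sin(a), math.cos(a))
    anticlockwise = (math.sin(a), -math.cos(a))
    coupled = coupler.update(clockwise, {"volume": clockwise,
                                         "channel": anticlockwise})
# coupled is now "volume"
```

Because the matcher only compares motion directions, it is agnostic to what produces the motion – a hand, a head or a mug all look the same to it.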
Unlike existing gesture control technology, the software does not look for a specific body part it has been trained to identify – such as a hand. Lancaster’s technology looks for rotating movement, so it requires no calibration and no prior knowledge of objects. This gives the user much more flexibility and ease: the technique works even while their hands are full, and whether they are standing or slouching on the sofa.
Users also do not need to learn specific commands to activate different functions, as is the case with some gesture-controlled televisions on the market, and they can decouple at will.
When the user selects volume adjustment or channel selection, sliders appear. The user moves their hand, head or object in the direction indicated by the slider to change the volume or to find the desired channel.
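Once a coupling is active, the slider interaction amounts to projecting the pointer's frame-to-frame displacement onto the slider's axis. A minimal sketch, in which the pixel-to-units gain and the clamped range are both assumed values:

```python
def slide(value, displacement, axis=(1.0, 0.0), gain=0.5, lo=0.0, hi=100.0):
    """Advance a slider by the component of the pointer's displacement that
    lies along the slider's axis; gain and range limits are assumed values."""
    delta = (displacement[0] * axis[0] + displacement[1] * axis[1]) * gain
    return max(lo, min(hi, value + delta))

# Moving the hand 20 px rightwards (with 3 px of vertical jitter, which a
# horizontal slider ignores) raises the volume from 50 to 60.
volume = slide(50.0, (20.0, 3.0))
# volume == 60.0
```

Projecting onto the axis means off-axis wobble – inevitable with free-hand motion – has no effect on the value.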
As well as televisions, the technology can be used with other screens. For example, YouTube tutorials on tasks such as mending bikes or baking cakes could be easily paused and rewound on tablet computers without users having to put down tools or mixing bowls.
Multiple pointers can be created to allow more than one user to point at drawings or pictures on interactive whiteboards simultaneously. Matchpoint also allows users to manipulate images on whiteboards by using two hands to zoom in and out, and rotate images.
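The two-handed zoom-and-rotate manipulation can be expressed with standard two-point geometry: the change in distance between the pointers gives the zoom factor, and the change in angle between them gives the rotation. A sketch under those assumptions (the function name is illustrative):

```python
import math

def two_pointer_transform(p1, p2, q1, q2):
    """Zoom factor and rotation (in degrees) implied by two pointers moving
    from positions (p1, p2) to (q1, q2)."""
    v0 = (p2[0] - p1[0], p2[1] - p1[1])  # vector between pointers, before
    v1 = (q2[0] - q1[0], q2[1] - q1[1])  # vector between pointers, after
    scale = math.hypot(*v1) / math.hypot(*v0)
    angle = math.degrees(math.atan2(v1[1], v1[0]) - math.atan2(v0[1], v0[0]))
    return scale, angle

# Hands drift from horizontal and 1 unit apart to vertical and 2 units apart:
scale, angle = two_pointer_transform((0, 0), (1, 0), (0, 0), (0, 2))
# scale == 2.0, angle ≈ 90.0
```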
In addition to short-term couplings, users can link stationary objects to controls; these objects retain their control function even when left unused for prolonged periods. For example, a mug sitting on a table could change the track on a music player when moved left or right, and a rolling toy car could be used to adjust the volume. An object loses its coupling simply by being removed from the camera’s field of view.
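These long-lived object couplings can be modelled as a small registry that drops a binding once its object has been absent from the camera frame for some number of consecutive frames. A minimal sketch; the class, method names and timeout are assumptions, not details from the paper:

```python
class CouplingRegistry:
    """Tracks long-lived object-to-control couplings; a coupling is dropped
    after its object has been missing from the camera frame for `timeout`
    consecutive frames (the timeout is an assumed parameter)."""
    def __init__(self, timeout=30):
        self.timeout = timeout
        self.bindings = {}   # object id -> control name
        self.missing = {}    # object id -> consecutive frames unseen

    def couple(self, obj, control):
        self.bindings[obj] = control
        self.missing[obj] = 0

    def frame(self, visible):
        """Call once per camera frame with the set of detected object ids."""
        for obj in list(self.bindings):
            if obj in visible:
                self.missing[obj] = 0
            else:
                self.missing[obj] += 1
                if self.missing[obj] >= self.timeout:
                    del self.bindings[obj]   # out of view long enough:
                    del self.missing[obj]    # forget the coupling

    def control_for(self, obj):
        return self.bindings.get(obj)

registry = CouplingRegistry(timeout=3)
registry.couple("mug", "next_track")
registry.frame({"mug"})                    # mug still on the table
still_bound = registry.control_for("mug")  # "next_track"
for _ in range(3):                         # mug carried out of view
    registry.frame(set())
now_bound = registry.control_for("mug")    # None: coupling dropped
```

Putting the mug back in view after the timeout would start from scratch – it would need to be re-coupled before controlling anything again.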
Christopher Clarke, PhD student at Lancaster University’s School of Computing and Communications, and developer of the technology, said: “Spontaneous spatial coupling is a new approach to gesture control that works by matching movement instead of asking the computer to recognise a specific object.
“Our method allows for a much more user-friendly experience where you can change channels without having to put down your drink, or change your position, whether that is relaxing on the sofa or standing in the kitchen following a recipe.
“Everyday objects in the house can now easily become remote controls so there are no more frantic searches for remote controls when your favourite programme is about to start on another channel, and now everyone in the room has the ‘remote’. You could even change the channel with your pet cat.”
Researchers believe Matchpoint is also well suited for use as an accessibility tool for people who are unable to use traditional input devices, such as remote controls or a mouse and keyboard.