Imagine changing the channel of your TV simply by moving your cup of tea, adjusting the volume on a music player by rolling a toy car, or rotating a spatula to pause a cookery video on your tablet.
New gesture control technology that can turn everyday objects into remote controls could revolutionise how we interact with televisions and other screens – ending frustrating searches for remote controls that have slipped down the side of sofa cushions.
In a paper – ‘Matchpoint: Spontaneous spatial coupling of body movement for touchless pointing’ – which will be presented at the UIST2017 conference in Quebec City this October, researchers from Lancaster University show a novel technique that allows body movement, or movement of objects, to be used to interact with screens.
The ‘Matchpoint’ technology, which requires only a simple webcam, works by displaying moving targets that orbit a small circular widget in the corner of the screen. These targets correspond to different functions – such as volume, changing channel or viewing a menu. The user synchronises the movement of their hand, head or an object with the direction of a target’s movement to achieve what the researchers call ‘spontaneous spatial coupling’, which activates the corresponding function.
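The paper describes the researchers’ actual matching technique; as a rough illustration only, the sketch below shows one way such motion matching could work. A short history of the user’s movement directions is compared against each orbiting target’s directions, and the system couples to the best-matching target above a threshold. All names, the direction-sampling scheme and the threshold value are illustrative assumptions, not the authors’ implementation:

```python
import math

def direction_correlation(user_dirs, target_dirs):
    """Mean cosine similarity between paired movement directions (radians):
    1.0 means perfectly synchronised, -1.0 means opposite motion."""
    assert len(user_dirs) == len(target_dirs)
    return sum(math.cos(u - t) for u, t in zip(user_dirs, target_dirs)) / len(user_dirs)

def match_target(user_dirs, targets, threshold=0.9):
    """Couple to whichever orbiting target the user's motion best matches,
    provided the match exceeds the threshold; otherwise report no coupling."""
    best_name, best_score = None, threshold
    for name, target_dirs in targets.items():
        score = direction_correlation(user_dirs, target_dirs)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Two hypothetical targets orbiting half a revolution out of phase.
targets = {
    "volume":  [i * 0.1 for i in range(20)],
    "channel": [i * 0.1 + math.pi for i in range(20)],
}
# A user whose hand roughly tracks the "volume" target couples to it.
user_dirs = [d + 0.05 for d in targets["volume"]]
print(match_target(user_dirs, targets))
```

Because the comparison is based purely on matching motion, any moving point – a hand, a head or a mug – can couple to a target, which is consistent with the calibration-free behaviour the article describes.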
Unlike existing gesture control technology, the software does not look for a specific body part that it has been trained to identify – such as a hand. Instead, Lancaster’s technology looks for rotating movement, so it requires neither calibration nor prior knowledge of the objects being tracked. This gives the user far more flexibility and ease: the system works even when their hands are full, and whether they are standing or slouching on the sofa.
Users also do not need to learn specific commands to activate different functions, as is the case with some gesture-controlled televisions on the market, and they can decouple at will.
When volume adjustment or channel selection is chosen, sliders appear. The user moves their hand, head or object in the direction indicated by the slider to change the volume or to find the desired channel.
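Once coupled, continuous control of this kind can be thought of as projecting the tracked point’s displacement onto the slider’s axis. The minimal sketch below assumes displacements in screen pixels and a hypothetical sensitivity parameter; it is not taken from the paper:

```python
def slider_delta(dx, dy, axis, sensitivity=0.5):
    """Map a tracked displacement (dx, dy) in pixels to a slider value change.

    Only the component of motion along the slider's axis contributes;
    screen y grows downward, so upward motion gives a positive change.
    """
    component = dx if axis == "horizontal" else -dy
    return component * sensitivity

# Moving the coupled hand/object 20 px to the right on a horizontal
# volume slider nudges the volume up by 10 units at this sensitivity.
volume = 50.0
volume += slider_delta(20, 0, "horizontal")
print(volume)
```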
As well as televisions, the technology can also be used with other screens. For example, YouTube tutorials, such as mending bikes or baking cakes, could be easily paused and rewound on tablet computers without users having to put down tools or mixing bowls.
Multiple pointers can be created to allow more than one user to point at drawings or pictures on interactive whiteboards simultaneously. Matchpoint also allows users to manipulate images on whiteboards by using two hands to zoom in and out, and rotate images.
In addition to short-term couplings, users can also link stationary objects to controls; these couplings are retained even when the objects are left untouched for prolonged periods. For example, a mug sitting on a table could change the track on a music player when moved left or right, and a rolling toy car could be used to adjust the volume. Objects lose their coupling with a control simply by being removed from the camera’s field of view.
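The persistence rule described above – couplings survive while an object remains in view, even if stationary, and are dropped once it leaves the frame – could be sketched as a small registry updated once per camera frame. The class and object identifiers here are hypothetical, purely to illustrate the behaviour:

```python
class CouplingRegistry:
    """Tracks which detected objects are coupled to which controls.

    A coupling persists as long as the object stays in the camera's view,
    even when the object is stationary; it is dropped as soon as the
    object disappears from the field of view.
    """

    def __init__(self):
        self._couplings = {}  # object id -> control name

    def couple(self, object_id, control):
        self._couplings[object_id] = control

    def control_for(self, object_id):
        return self._couplings.get(object_id)

    def update_visible(self, visible_ids):
        """Call once per frame with the ids currently detected in view;
        any coupled object no longer visible loses its coupling."""
        for object_id in list(self._couplings):
            if object_id not in visible_ids:
                del self._couplings[object_id]

registry = CouplingRegistry()
registry.couple("mug", "next_track")
registry.update_visible({"mug", "toy_car"})   # mug still in view: coupled
registry.update_visible({"toy_car"})          # mug removed: decoupled
print(registry.control_for("mug"))
```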
Christopher Clarke, PhD student at Lancaster University’s School of Computing and Communications, and developer of the technology, said: “Spontaneous spatial coupling is a new approach to gesture control that works by matching movement instead of asking the computer to recognise a specific object.
“Our method allows for a much more user-friendly experience where you can change channels without having to put down your drink, or change your position, whether that is relaxing on the sofa or standing in the kitchen following a recipe.
“Everyday objects in the house can now easily become remote controls so there are no more frantic searches for remote controls when your favourite programme is about to start on another channel, and now everyone in the room has the ‘remote’. You could even change the channel with your pet cat.”
The researchers believe Matchpoint is also suitable as an accessibility tool for people who are unable to use traditional pointing devices, such as remote controls or a mouse and keyboard.