CQT researchers and their collaborator present a quantum speed-up for machine learning
One of the ways that computers ‘think’ is by analysing relationships within large sets of data. CQT’s Jansen (Zhikuan) Zhao, Anupam Prakash and their collaborator have shown that quantum computers can do one such analysis faster than classical computers, for a wider array of data types than was previously expected.
The team’s proposed ‘quantum linear system algorithm’ is published in the 2 February issue of Physical Review Letters. In the future, it could help crunch numbers on problems as varied as commodities pricing, social networks and chemical structures.
“The previous quantum algorithm of this kind applied to a very specific type of problem. We need an upgrade if we want to achieve a quantum speed-up for other data,” says Jansen, who is corresponding author on the work.
That’s exactly what the team is offering. The CQT researchers began collaborating with Leonard Wossnig, then a Master’s student at ETH Zurich, when he visited the Centre. Jansen is a PhD student with the Singapore University of Technology and Design, and Anupam is a research fellow at CQT.
The first quantum linear system algorithm was proposed in 2009 by a different group of researchers. That algorithm kick-started research into quantum forms of machine learning, or artificial intelligence.
A linear system algorithm works on a large matrix of data. For example, a trader might be trying to predict the future price of goods. The matrix may capture historical data about price movements over time and data about features that could be influencing these prices, such as currency exchange rates. The algorithm calculates how strongly each feature is correlated with another by ‘inverting’ the matrix. This information can then be used to extrapolate into the future.
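Classically, the matrix inversion at the heart of such an analysis amounts to solving a linear system A·x = b for the unknown coefficients x. A minimal pure-Python sketch of the standard classical approach, Gaussian elimination (this is only an illustration of the classical baseline, not the team's quantum algorithm; the example numbers are made up):

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.

    Runs in O(n^3) steps for an n-by-n matrix, which is the cubic
    scaling that makes large matrices hard for classical computers.
    """
    n = len(A)
    # Build the augmented matrix [A | b], copying so inputs are untouched.
    M = [row[:] + [bi] for row, bi in zip(A, b)]

    # Forward elimination: zero out everything below the diagonal.
    for col in range(n):
        # Pivot on the largest entry in this column for numerical stability.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]

    # Back-substitution: solve for x from the last row upwards.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Toy example: two features (say, an exchange rate and a past price)
# whose weights we want to recover from two observations.
weights = solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0])
print(weights)  # [1.0, 3.0]
```

The triple-nested loop in the elimination step is where the cubic cost comes from, and it is exactly this cost that the quantum algorithms aim to beat.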
“There is a lot of computation involved in analysing the matrix. When it gets beyond, say, 10,000 by 10,000 entries, it becomes hard for classical computers,” explains Jansen. This is because the number of computational steps grows rapidly with the size of the matrix: every doubling of the matrix’s width and height increases the length of the calculation eight-fold.
The 2009 algorithm could cope better with bigger matrices, but only if the data in them is what’s known as ‘sparse’, meaning most of the matrix entries are zero and there are few relationships among the elements. Real-world data is often not sparse.
Jansen, Anupam and Leonard present a new algorithm that is faster than both the classical and the previous quantum versions, without restrictions on the kind of data it works for.
As a rough guide, for a 10,000-by-10,000 matrix, the classical algorithm would take on the order of a trillion computational steps, the first quantum algorithm tens of thousands of steps, and the new quantum algorithm just hundreds of steps. The new algorithm relies on a technique known as quantum singular value estimation.
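The article's rough figures are consistent with cubic, linear and square-root growth in the matrix dimension, respectively (logarithmic factors and other dependences, such as on the matrix's condition number, are ignored here). A back-of-the-envelope check, purely illustrative:

```python
# Order-of-magnitude step counts for an n-by-n matrix, n = 10,000.
# These are the article's rough figures, not derived complexity bounds.
n = 10_000

classical = n ** 3             # cubic scaling: ~a trillion steps
first_quantum = n              # ~tens of thousands of steps on dense data
new_quantum = round(n ** 0.5)  # ~hundreds of steps

print(classical, first_quantum, new_quantum)  # 1000000000000 10000 100

# Cubic scaling also explains the eight-fold jump quoted above:
# doubling the matrix dimension multiplies the classical work by 2**3 = 8.
assert (2 * n) ** 3 // n ** 3 == 8
```

The gap between the cubic and square-root columns is what makes the speed-up interesting: it widens rapidly as matrices grow, which is precisely the regime where classical computers struggle.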
There have been a few proof-of-principle demonstrations of the earlier quantum linear system algorithm on small-scale quantum computers. Jansen and his colleagues hope to work with an experimental group to run a proof-of-principle demonstration of their algorithm, too. They also want to do a full analysis of the effort required to implement the algorithm, checking what overhead costs there may be.
To show a real quantum advantage over the classical algorithms will need bigger quantum computers. Jansen estimates that “We’re maybe looking at three to five years in the future when we can actually use the hardware built by the experimentalists to do meaningful quantum computation with application in artificial intelligence.”
Learn more: Quantum algorithm could help AI think faster