CQT researchers and their collaborator present a quantum speed-up for machine learning
One of the ways that computers ‘think’ is by analysing relationships within large sets of data. CQT’s Jansen (Zhikuan) Zhao, Anupam Prakash and their collaborator have shown that quantum computers can do one such analysis faster than classical computers, for a wider array of data types than was previously expected.
The team’s proposed ‘quantum linear system algorithm’ is published in the 2 February issue of Physical Review Letters. In the future, it could help crunch numbers on problems as varied as commodities pricing, social networks and chemical structures.
“The previous quantum algorithm of this kind applied to a very specific type of problem. We need an upgrade if we want to achieve a quantum speed-up for other data,” says Jansen, who is corresponding author on the work.
That’s exactly what the team is offering. The CQT researchers began collaborating with Leonard Wossnig, then a Master’s student at ETH Zurich, when he visited the Centre. Anupam is a research fellow at CQT, and Jansen is a PhD student with the Singapore University of Technology and Design.
The first quantum linear system algorithm was proposed in 2009 by a different group of researchers. That algorithm kick-started research into quantum forms of machine learning, a branch of artificial intelligence.
A linear system algorithm works on a large matrix of data. For example, a trader might be trying to predict the future price of goods. The matrix may capture historical data about price movements over time and data about features that could be influencing these prices, such as currency exchange rates. The algorithm calculates how strongly each feature is correlated with another by ‘inverting’ the matrix. This information can then be used to extrapolate into the future.
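Classically, this kind of analysis boils down to solving a linear system: finding the weights x in Ax = b, where A holds the feature data and b the observed prices. A minimal sketch in Python using NumPy; the numbers here are invented purely for illustration, not taken from any real market data:

```python
import numpy as np

# Toy example: four historical observations of three features
# (e.g. past price level, exchange rate, demand index) -- invented data.
A = np.array([
    [1.0, 0.5, 2.0],
    [0.9, 0.6, 1.8],
    [1.1, 0.4, 2.2],
    [1.0, 0.5, 1.9],
])
b = np.array([10.0, 9.5, 10.8, 9.9])  # observed prices for each row

# Solving the (least-squares) linear system A x = b is equivalent to
# applying the inverse of A to b; the resulting weights x capture how
# strongly each feature relates to the price.
x, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)

# Extrapolate: predict the price for a new, unseen feature vector.
new_features = np.array([1.05, 0.55, 2.1])
predicted_price = new_features @ x
print(predicted_price)
```

The quantum algorithms in the article perform the analogue of the `lstsq` step (the matrix inversion), which is the part that dominates the cost as the matrix grows.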
“There is a lot of computation involved in analysing the matrix. When it gets beyond say 10,000 by 10,000 entries, it becomes hard for classical computers,” explains Jansen. This is because the number of computational steps goes up rapidly with the number of elements in the matrix: every doubling of the matrix size increases the length of the calculation eight-fold.
The 2009 algorithm could cope better with bigger matrices, but only if the data in them is what’s known as ‘sparse’, meaning most of the entries are zero and there are limited relationships among the elements — which is often not true of real-world data.
Jansen, Anupam and Leonard present a new algorithm that is faster than both the classical and the previous quantum versions, without restrictions on the kind of data it works for.
As a rough guide, for a matrix of 10,000 by 10,000 entries, the classical algorithm would take on the order of a trillion computational steps, the first quantum algorithm tens of thousands of steps, and the new quantum algorithm just hundreds. The algorithm relies on a technique known as quantum singular value estimation.
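The rough figures above follow from the scaling laws: a classical dense solver needs on the order of N³ steps (hence the eight-fold jump when N doubles), while the quantum algorithms scale polynomially in log N rather than in N. A back-of-the-envelope sketch; the two quantum exponents below are illustrative stand-ins chosen to reproduce the article's rough figures, not the exact bounds from the papers:

```python
import math

N = 10_000  # matrix dimension

# Classical dense solve scales as ~N^3: doubling N multiplies the
# step count by 2^3 = 8, and N = 10,000 gives about a trillion steps.
classical_steps = N ** 3
assert (2 * N) ** 3 == 8 * classical_steps  # the eight-fold rule

# The quantum algorithms instead scale polynomially in log2(N).
# Exponents here are hypothetical, picked only to land near the
# "tens of thousands" and "hundreds" figures quoted in the text.
log_n = math.log2(N)
first_quantum_steps = log_n ** 3.6   # 2009 algorithm, sparse data only
new_quantum_steps = log_n ** 1.8     # new algorithm, no sparsity restriction

print(f"classical      ~ {classical_steps:.0e}")
print(f"2009 quantum   ~ {first_quantum_steps:.0f}")
print(f"new quantum    ~ {new_quantum_steps:.0f}")
```

The practical point is the gap between the curves: because the quantum costs grow with log N, the advantage over the N³ classical cost widens dramatically as the data matrix gets bigger.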
There have been a few proof-of-principle demonstrations of the earlier quantum linear system algorithm on small-scale quantum computers. Jansen and his colleagues hope to work with an experimental group to run a proof-of-principle demonstration of their algorithm, too. They also want to do a full analysis of the effort required to implement the algorithm, checking what overhead costs there may be.
To show a real quantum advantage over the classical algorithms will need bigger quantum computers. Jansen estimates that “We’re maybe looking at three to five years in the future when we can actually use the hardware built by the experimentalists to do meaningful quantum computation with application in artificial intelligence.”
Learn more: Quantum algorithm could help AI think faster