CQT researchers and their collaborator present a quantum speed-up for machine learning
One of the ways that computers ‘think’ is by analysing relationships within large sets of data. CQT’s Jansen (Zhikuan) Zhao, Anupam Prakash and their collaborator have shown that quantum computers can do one such analysis faster than classical computers, for a wider array of data types than was previously expected.
The team’s proposed ‘quantum linear system algorithm’ is published in the 2 February issue of Physical Review Letters. In the future, it could help crunch numbers on problems as varied as commodities pricing, social networks and chemical structures.
“The previous quantum algorithm of this kind applied to a very specific type of problem. We need an upgrade if we want to achieve a quantum speed-up for other data,” says Jansen, who is corresponding author on the work.
That’s exactly what the team is offering. The CQT researchers began collaborating with Leonard Wossnig when he visited the Centre as a Master’s student from ETH Zurich. Jansen is a PhD student with the Singapore University of Technology and Design, and Anupam is a research fellow.
The first quantum linear system algorithm was proposed in 2009 by a different group of researchers. That algorithm kick-started research into quantum forms of machine learning, or artificial intelligence.
A linear system algorithm works on a large matrix of data. For example, a trader might be trying to predict the future price of goods. The matrix may capture historical data about price movements over time and data about features that could be influencing these prices, such as currency exchange rates. The algorithm calculates how strongly each feature is correlated with another by ‘inverting’ the matrix. This information can then be used to extrapolate into the future.
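The idea above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's method: the features, weights, and data here are all invented for the example, and in practice a solver is used rather than forming the matrix inverse explicitly.

```python
import numpy as np

# Hypothetical data: rows are historical observations, columns are features
# (e.g. past prices, an exchange rate) that may influence a price.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 3))           # 100 observations, 3 features
true_weights = np.array([2.0, -1.0, 0.5])
b = A @ true_weights                    # observed prices (noise-free here)

# 'Inverting' the matrix: solve A x = b for the feature weights x.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x, 3))                   # recovers the weights

# Extrapolate: predict the price for a new set of feature values.
new_features = np.array([1.0, 0.2, -0.5])
print(new_features @ x)
```

Once the weights are known, any new combination of feature values can be turned into a prediction, which is the extrapolation step the article describes.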
“There is a lot of computation involved in analysing the matrix. When it gets beyond say 10,000 by 10,000 entries, it becomes hard for classical computers,” explains Jansen. This is because the number of computational steps goes up rapidly with the number of elements in the matrix: every doubling of the matrix size increases the length of the calculation eight-fold.
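The eight-fold figure corresponds to a cubic cost model. As a sketch, assuming a dense classical solve takes on the order of n³ steps for an n-by-n matrix:

```python
# Rough cost model (an assumption, not the paper's analysis): a dense
# classical solve of an n-by-n system takes on the order of n**3 steps.
def classical_steps(n):
    return n ** 3

# Doubling the matrix size multiplies the work by 2**3 = 8.
for n in (1_000, 2_000, 4_000):
    ratio = classical_steps(2 * n) // classical_steps(n)
    print(n, classical_steps(n), ratio)
```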
The 2009 algorithm could cope better with bigger matrices, but only if the data in them are what’s known as ‘sparse’: most entries are zero, so there are only limited relationships among the elements. Real-world data often fail this condition.
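The sparsity distinction can be made concrete. In this illustrative sketch (the 1% density is an arbitrary choice), a sparse matrix has almost all zero entries, while a generic dense matrix has essentially none:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000

# 'Sparse': almost all entries are zero, so each row relates an element
# to only a handful of others.
sparse = rng.normal(size=(n, n)) * (rng.random((n, n)) < 0.01)
# 'Dense': a generic real-valued matrix has essentially no zero entries.
dense = rng.normal(size=(n, n))

print(np.count_nonzero(sparse) / sparse.size)   # about 0.01
print(np.count_nonzero(dense) / dense.size)     # 1.0
```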
Jansen, Anupam and Leonard present a new algorithm that is faster than both the classical and the previous quantum versions, without restrictions on the kind of data it works for.
As a rough guide, for a 10,000-by-10,000 matrix, the classical algorithm would take on the order of a trillion computational steps, the first quantum algorithm tens of thousands of steps, and the new quantum algorithm just hundreds of steps. The algorithm relies on a technique known as quantum singular value estimation.
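The trillion-step figure follows from the cubic cost model: a back-of-envelope check, treating these as orders of magnitude only rather than the algorithms' exact cost models:

```python
# Back-of-envelope check of the article's rough guide.
n = 10_000
classical = n ** 3                 # dense classical solve, roughly n^3 steps
print(f"classical: ~{classical:.0e} steps")   # 10**12, a trillion
```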
There have been a few proof-of-principle demonstrations of the earlier quantum linear system algorithm on small-scale quantum computers. Jansen and his colleagues hope to work with an experimental group to run a proof-of-principle demonstration of their algorithm, too. They also want to do a full analysis of the effort required to implement the algorithm, checking what overhead costs there may be.
To show a real quantum advantage over the classical algorithms will need bigger quantum computers. Jansen estimates that “We’re maybe looking at three to five years in the future when we can actually use the hardware built by the experimentalists to do meaningful quantum computation with application in artificial intelligence.”
Learn more: Quantum algorithm could help AI think faster