U-M researchers created a reservoir computing system that reduces training time and improves capacity of similar neural networks.
A new type of neural network made with memristors can dramatically improve the efficiency of teaching machines to think like humans. The network, called a reservoir computing system, could predict words before they are said during conversation, and help predict future outcomes based on the present.
The research team that created the reservoir computing system, led by Wei Lu, U-M professor of electrical engineering and computer science, recently published their work in Nature Communications.
Reservoir computing systems, which improve on a typical neural network’s capacity and reduce the required training time, have been created in the past with larger optical components. However, the U-M group created their system using memristors, which require less space and can be integrated more easily into existing silicon-based electronics.
Memristors are a special type of resistive device that can both perform logic and store data. This contrasts with typical computer systems, in which processors perform logic separately from the memory modules that store data. In this study, Lu’s team used a special memristor whose state depends only on recent events, giving it a short-term memory.
Inspired by brains, neural networks are composed of neurons, or nodes, and synapses, the connections between nodes.
To train a neural network for a task, it takes in a large set of questions along with the answers to those questions. In this process, called supervised learning, the connections between nodes are weighted more heavily or lightly to minimize the error in reaching the correct answer.
Once trained, a neural network can then be tested without knowing the answer. For example, a system can process a new photo and correctly identify a human face, because it has learned the features of human faces from other photos in its training set.
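The train-then-test loop described above can be sketched with a single artificial neuron. This is an illustrative toy, not the paper's network: the neuron's weights (which start at zero) are nudged in proportion to each prediction error, and afterward the trained neuron classifies a point it has never seen.

```python
# Minimal supervised-learning sketch: a single neuron adjusts its
# connection weights to reduce error on labeled "question/answer" pairs.
def train(samples, epochs=50, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - pred
            # Weight connections more heavily or lightly based on the error
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Training set: the "answer" is 1 when the coordinates sum to more than 1
data = [((0.2, 0.1), 0), ((0.9, 0.8), 1), ((0.1, 0.4), 0), ((0.7, 0.9), 1)]
w, b = train(data)
print(predict(w, b, (0.95, 0.85)))  # a new, unseen point -> 1
```

Once trained on the labeled pairs, the neuron generalizes to inputs it was never shown, just as a face-recognition network generalizes to new photos.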
“A lot of times, it takes days or months to train a network,” says Lu. “It is very expensive.”
Image recognition is also a relatively simple problem, as it doesn’t require any information apart from a static image. More complex tasks, such as speech recognition, can depend highly on context and require neural networks to have knowledge of what has just occurred, or what has just been said.
“When transcribing speech to text or translating languages, a word’s meaning and even pronunciation will differ depending on the previous syllables,” says Lu.
This requires a recurrent neural network, which incorporates loops within the network that give the network a memory effect. However, training these recurrent neural networks is especially expensive, Lu says.
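The memory effect of such a loop can be seen in a one-node sketch with made-up weights (`w_in`, `w_rec` here are illustrative values, not anything from the study): the hidden state feeds back into the next update, so an input from several steps ago still echoes in the current state.

```python
import math

# One recurrent node: the hidden state h loops back into each update,
# giving the network a fading memory of past inputs.
def rnn_step(h, x, w_in=0.5, w_rec=0.9):
    return math.tanh(w_in * x + w_rec * h)

h = 0.0
for x in [1.0, 0.0, 0.0, 0.0]:  # a single pulse, then silence
    h = rnn_step(h, x)
print(h)  # still nonzero: the pulse from three steps ago persists
```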
Reservoir computing systems built with memristors, however, can skip most of the expensive training process and still provide the network the capability to remember. This is because the most critical component of the system – the reservoir – does not require training.
When a set of data is inputted into the reservoir, the reservoir identifies important time-related features of the data and hands them off, in a simpler format, to a second network. This second network then needs only the kind of training used for simpler neural networks: adjusting the weights of the features the first network passed on until it reaches an acceptable level of error.
“The beauty of reservoir computing is that while we design it, we don’t have to train it,” says Lu.
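This division of labor can be sketched in software as an echo-state-style toy (a stand-in for the memristor hardware; the sizes, weights, and delay-recall task here are all assumptions for illustration). The reservoir's connections are fixed at random and never trained; only the linear readout is adjusted, here with a simple delta rule.

```python
import math
import random

random.seed(1)
N = 20  # reservoir nodes in this toy (the actual device used 88 memristors)

# The reservoir: fixed random connections that are never trained.
W_in = [random.uniform(-0.5, 0.5) for _ in range(N)]
W_res = [[random.uniform(-0.1, 0.1) for _ in range(N)] for _ in range(N)]

def step(h, x):
    # Each node mixes the current input with the previous reservoir
    # state -- this recurrence is what gives the system its memory.
    return [math.tanh(W_in[i] * x + sum(W_res[i][j] * h[j] for j in range(N)))
            for i in range(N)]

# Toy time-dependent task: from the current reservoir state,
# recall what the *previous* input was.
inputs = [random.choice([-1.0, 1.0]) for _ in range(600)]
w_out = [0.0] * N  # the readout -- the only part that gets trained

def mean_sq_error():
    h, err, count = [0.0] * N, 0.0, 0
    for t, x in enumerate(inputs):
        h = step(h, x)
        if t >= 100:  # skip the initial transient
            y = sum(w * s for w, s in zip(w_out, h))
            err += (y - inputs[t - 1]) ** 2
            count += 1
    return err / count

before = mean_sq_error()

# Train only the readout weights with a simple delta rule.
h = [0.0] * N
for t, x in enumerate(inputs):
    h = step(h, x)
    if t == 0:
        continue
    y = sum(w * s for w, s in zip(w_out, h))
    err = inputs[t - 1] - y
    w_out = [w + 0.05 * err * s for w, s in zip(w_out, h)]

after = mean_sq_error()
print(before, after)  # the error drops once the readout is trained
```

Because only the small readout layer is trained, the expensive part of recurrent-network training is skipped entirely, while the untrained reservoir still supplies the memory.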
The team proved the reservoir computing concept using a test of handwriting recognition, a common benchmark among neural networks. Numerals were broken up into rows of pixels, and fed into the computer with voltages like Morse code, with zero volts for a dark pixel and a little over one volt for a white pixel.
Using only 88 memristors as nodes, the reservoir achieved 91% accuracy in identifying handwritten numerals. A conventional network would require thousands of nodes for the same task.
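The pixel-to-voltage encoding can be sketched as a one-line mapping. The exact white-pixel voltage is not given beyond "a little over one volt," so the 1.1 V used here is an assumed placeholder.

```python
# Hypothetical encoding of one pixel row as a voltage pulse train:
# 0 V for a dark pixel, ~1.1 V (assumed value) for a white pixel.
def row_to_voltages(row, v_white=1.1):
    return [v_white if px else 0.0 for px in row]

print(row_to_voltages([1, 0, 0, 1, 1]))  # [1.1, 0.0, 0.0, 1.1, 1.1]
```

Fed in sequentially, row by row, these pulse trains play the role of the dots and dashes in Morse code.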
Reservoir computing systems are especially adept at handling data that varies with time, like a stream of data or words, or a function depending on past results.
To demonstrate this, the team tested a complex function that depended on multiple past results, which is common in engineering fields. The reservoir computing system was able to model the complex function with minimal error.
Lu plans on exploring two future paths with this research: speech recognition and predictive analysis.
“We can make predictions on natural spoken language, so you don’t even have to say the full word,” explains Lu.
“We could actually predict what you plan to say next.”
In predictive analysis, Lu hopes to use the system to take in signals with noise, like static from far-off radio stations, and produce a cleaner stream of data. “It could also predict and generate an output signal even if the input stopped,” he says.