A battery-like device could act as an artificial synapse within computing systems intended to imitate the brain’s efficiency and ability to learn.
The brain’s capacity for simultaneously learning and memorizing large amounts of information while requiring little energy has inspired an entire field to pursue brain-like – or neuromorphic – computers. Researchers at Stanford University and Sandia National Laboratories previously developed one portion of such a computer: a device that acts as an artificial synapse, mimicking the way neurons communicate in the brain.
In a paper published online by the journal Science on April 25, the team reports that a prototype array of nine of these devices performed even better than expected in processing speed, energy efficiency, reproducibility and durability.
Looking forward, the team members want to combine their artificial synapse with traditional electronics, which they hope could be a step toward supporting artificially intelligent learning on small devices.
“If you have a memory system that can learn with the energy efficiency and speed that we’ve presented, then you can put that in a smartphone or laptop,” said Scott Keene, co-author of the paper and a graduate student in the lab of Alberto Salleo, professor of materials science and engineering at Stanford who is co-senior author. “That would open up access to the ability to train our own networks and solve problems locally on our own devices without relying on data transfer to do so.”
A bad battery, a good synapse
The team’s artificial synapse is similar to a battery, modified so that the researchers can dial the flow of electricity between its two terminals up or down. That tunable flow of electricity emulates how learning strengthens or weakens connections in the brain. The design is especially efficient because data processing and memory storage happen in a single action, unlike in a traditional computer system, where data is processed first and moved to storage later.
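The idea can be pictured with a minimal sketch. This toy model (not the team’s actual device physics; the conductance range and class name are assumptions for illustration) shows how a single stored quantity serves as both the memory and the computation: programming shifts a conductance, like charging or discharging a battery, and a read simply passes current through that same stored state.

```python
class ArtificialSynapse:
    """Toy model of a battery-like synaptic device (illustrative only)."""

    def __init__(self, conductance=1e-6, g_min=1e-7, g_max=1e-5):
        # Stored state is a conductance in siemens (hypothetical range).
        self.g = conductance
        self.g_min, self.g_max = g_min, g_max

    def program(self, delta_g):
        """Dial the stored 'weight' up or down, clipped to the device range."""
        self.g = min(self.g_max, max(self.g_min, self.g + delta_g))

    def read(self, voltage):
        """Memory and processing in one step: output current I = G * V."""
        return self.g * voltage
```

Because the weight never leaves the device, reading it and computing with it are the same operation; there is no separate trip between a processor and a memory bank.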
Seeing how these devices perform in an array is a crucial step because it allows the researchers to program several artificial synapses simultaneously. This is far less time-consuming than programming each synapse one by one, and it is closer to how the brain actually works.
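The time savings can be sketched in a few lines. This is a schematic comparison, not the team’s programming protocol: updating a row of devices in one shared operation reaches the same target states as visiting each device individually, but with far fewer programming steps.

```python
import numpy as np

rng = np.random.default_rng(0)
target = rng.uniform(1e-7, 1e-5, size=(3, 3))  # desired conductances (arbitrary)
initial = np.full((3, 3), 1e-6)                # starting device states

# Serial programming: visit each synapse individually (9 operations here).
serial = initial.copy()
for i in range(3):
    for j in range(3):
        serial[i, j] = target[i, j]

# Parallel programming: set a whole row in one shared operation (3 operations),
# analogous to driving shared row/column lines of a crossbar at once.
parallel = initial.copy()
for i in range(3):
    parallel[i, :] = target[i, :]

assert np.allclose(serial, parallel)
```

For a 3-by-3 array the difference is small, but the gap between one-by-one and row-at-a-time programming grows with array size.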
In previous tests of an earlier version of this device, the researchers found that its combined processing and memory action required about one-tenth as much energy as a state-of-the-art computing system needs to carry out specific tasks. Still, the researchers worried that the combined draw of all these devices working together in larger arrays could be too much power. So, they retooled each device to conduct less electrical current – making them much worse batteries but making the array even more energy efficient.
The 3-by-3 array relied on a second type of device – developed by Joshua Yang at the University of Massachusetts, Amherst, who is co-author of the paper – that acts as a switch for programming synapses within the array.
“Wiring everything up took a lot of troubleshooting and a lot of wires. We had to ensure all of the array components were working in concert,” said Armantas Melianas, a postdoctoral scholar in the Salleo lab. “But when we saw everything light up, it was like a Christmas tree. That was the most exciting moment.”
During testing, the array outperformed the researchers’ expectations. It ran so fast that the team predicts the next version of these devices will need to be tested with special high-speed electronics. After measuring high energy efficiency in the 3-by-3 array, the researchers ran computer simulations of a larger 1024-by-1024 synapse array and estimated that it could be powered by the same batteries currently used in smartphones or small drones. The researchers were also able to switch the devices over a billion times – another testament to their speed – without seeing any degradation in their behavior.
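A back-of-envelope calculation gives a feel for why a million-synapse array could run on a phone battery. Every number below is an illustrative assumption, not a figure from the paper or the simulations.

```python
# Rough power estimate for a 1024-by-1024 synapse array.
# All parameter values are assumptions chosen for illustration only.
n_synapses = 1024 * 1024       # array size from the simulation
energy_per_update = 1e-12      # joules per programming event (assumed)
updates_per_second = 1e6       # programming rate per synapse (assumed)

power_watts = n_synapses * energy_per_update * updates_per_second
battery_wh = 10.0              # typical smartphone battery, roughly 10 Wh
hours = battery_wh * 3600 / power_watts
print(f"array power: {power_watts:.2f} W, runtime: {hours:.0f} h")
```

Under these assumed figures the whole array draws on the order of a watt, which is the scale a smartphone or small-drone battery already supplies.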
“It turns out that polymer devices, if you treat them well, can be as resilient as traditional counterparts made of silicon. That was maybe the most surprising aspect from my point of view,” Salleo said. “For me, it changes how I think about these polymer devices in terms of reliability and how we might be able to use them.”
Room for creativity
The researchers haven’t yet submitted their array to tests that determine how well it learns, but that is something they plan to study. The team also wants to see how their device weathers different conditions – such as high temperatures – and to work on integrating it with electronics. There are also many fundamental questions left to answer that could help the researchers understand exactly why their device performs so well.
“We hope that more people will start working on this type of device because there are not many groups focusing on this particular architecture, but we think it’s very promising,” Melianas said.