Thanks to a mathematical breakthrough, AI applications like speech recognition, gesture recognition and ECG classification can become a hundred to a thousand times more energy efficient. This means it will be possible to put much more elaborate AI in chips, enabling applications to run on a smartphone or smartwatch where before this was done in the cloud.
Running the AI on local devices makes the applications both more robust and more privacy-friendly: more robust, because a network connection to the cloud is no longer necessary, and more privacy-friendly, because data can be stored and processed locally.
The mathematical breakthrough was achieved by researchers at Centrum Wiskunde & Informatica (CWI), the Dutch national research center for mathematics and computer science, together with the IMEC/Holst Research Centre in Eindhoven. The results were published in a paper by Bojian Yin, Federico Corradi, and Sander M. Bohté at the International Conference on Neuromorphic Systems. The underlying algorithms have been released as open source.
Under the supervision of CWI researcher and UvA professor of cognitive neurobiology Sander Bohté, the researchers developed a learning algorithm for so-called spiking neural networks. Such networks have been around for some time, but they are very difficult to handle mathematically, which has made them hard to put into practice. The new algorithm is groundbreaking in two ways: the neurons in the network need to communicate far less frequently, and each individual neuron has to perform fewer calculations.
“The combination of these two breakthroughs makes AI algorithms a thousand times more energy efficient compared with standard neural networks, and a factor of a hundred more energy efficient than current state-of-the-art neural networks”, says principal investigator Sander Bohté.
Inspired by the human brain
Bohté’s inspiration and motivation come from the incredibly energy-efficient way the human brain processes information, consuming only about 20 watts. Computers that mimic the brain’s neural networks have produced wonderful applications in recent years – ranging from image recognition, speech recognition and automatic translation to medical diagnosis – but they require up to a million times more energy than the human brain.
The spiking neural networks developed by Bohté and his research team differ from those already integrated in AI applications. “The communication between neurons in classical neural networks is continuous and easy to handle from a mathematical perspective. Spiking neurons look more like the human brain and communicate only sparingly and with short pulses. This however means that the signals are discontinuous and much more difficult to handle mathematically.”
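The sparse, pulse-based communication Bohté describes can be illustrated with the classic leaky integrate-and-fire (LIF) neuron model. The sketch below is a generic, simplified illustration of that idea, not the authors' actual algorithm (which uses trained, adaptive spiking neurons): the neuron integrates input over time and emits a short pulse only when its membrane potential crosses a threshold, so most time steps produce no output at all.

```python
def simulate_lif(inputs, threshold=1.0, decay=0.9):
    """Simulate a single leaky integrate-and-fire (LIF) neuron.

    The membrane potential leaks toward zero each step while integrating
    the incoming current; a spike (1) is emitted only when the potential
    crosses the threshold, after which it resets. Communication is
    therefore sparse and discontinuous, unlike the continuous-valued
    outputs of classical artificial neurons.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = decay * potential + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # short pulse ("spike")
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)   # silent: nothing is communicated
    return spikes

# A constant weak input drives only occasional spikes.
out = simulate_lif([0.3] * 20)
print(sum(out), "spikes over", len(out), "time steps")
```

The hard part mathematically is exactly what the example makes visible: the output is a train of discrete 0/1 events, so it has no ordinary derivative, and standard gradient-based training does not apply directly.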
New type of computer chip
To run spiking neural networks efficiently in the real world, a new type of chip is needed. Bohté says prototypes are already being developed: “All kinds of companies are working hard to make this happen, like our project partner IMEC/Holst Centre.”
Bohté’s methods can train spiking neural networks of up to a few thousand neurons – fewer than typical classical neural networks, but sufficient for many applications, such as speech recognition, ECG classification and gesture recognition. The next challenge is to scale these networks up to 100,000 or a million neurons, which will expand the range of possible applications even further.