A team of scientists from the Moscow Institute of Physics and Technology (MIPT) has created prototypes of “electronic synapses” based on ultra-thin films of hafnium oxide (HfO2). These prototypes could potentially be used in fundamentally new computing systems.
The paper has been published in the journal Nanoscale Research Letters.
The group of researchers from MIPT made HfO2-based memristors measuring just 40×40 nm². The nanostructures they built exhibit properties similar to those of biological synapses. Using newly developed technology, the memristors were integrated into matrices; in the future, this technology may be used to design computers that function similarly to biological neural networks.
Memristors (resistors with memory) are devices that are able to change their state (conductivity) depending on the charge passing through them, and they therefore have a memory of their “history”. In this study, the scientists used devices based on thin-film hafnium oxide, a material that is already used in the production of modern processors. This means that this new lab technology could, if required, easily be used in industrial processes.
“In a simpler version, memristors are promising binary non-volatile memory cells, in which information is written by switching the electric resistance – from high to low and back again. What we are trying to demonstrate are much more complex functions of memristors – that they behave similarly to biological synapses,” said Yury Matveyev, the corresponding author of the paper and senior researcher at MIPT’s Laboratory of Functional Materials and Devices for Nanoelectronics, commenting on the study.
Synapses – the key to learning and memory
A synapse is a point of connection between neurons whose main function is to transmit a signal (a spike – a particular type of signal, see fig. 2) from one neuron to another. Each neuron may have thousands of synapses, i.e. connect with a large number of other neurons. This means that information can be processed in parallel rather than sequentially (as in modern computers). This is why “living” neural networks are so immensely effective, in terms of both speed and energy consumption, at solving a large range of tasks, such as image and voice recognition.
Over time, synapses may change their “weight”, i.e. their ability to transmit a signal. This property is believed to be the key to understanding the learning and memory functions of the brain.
From the physical point of view, synaptic “memory” and “learning” in the brain can be interpreted as follows: the neural connection possesses a certain “conductivity”, which is determined by the previous “history” of signals that have passed through the connection. If a synapse transmits a signal from one neuron to another, we can say that it has high “conductivity”, and if it does not, we say it has low “conductivity”. However, synapses do not simply function in on/off mode; they can have any intermediate “weight” (intermediate conductivity value). Accordingly, if we want to simulate them using certain devices, these devices will also have to have analogous characteristics.
The memristor as an analogue of the synapse
As in a biological synapse, the value of the electrical conductivity of a memristor is the result of its previous “life” – from the moment it was made.
There are a number of physical effects that can be exploited to design memristors. In this study, the authors used devices based on ultrathin-film hafnium oxide, which exhibit the effect of soft (reversible) electrical breakdown under an applied external electric field. Most often, these devices use only two different states, encoding logic zero and one. However, in order to simulate biological synapses, a continuous spectrum of conductivities had to be achieved in the devices.
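The difference between a binary memory cell and an “analogue” memristor can be illustrated with a minimal toy model: instead of flipping between two resistance states, each voltage pulse nudges the conductance up or down within a bounded range. All class names and parameter values below are illustrative assumptions, not taken from the paper.

```python
# Toy model of an "analogue" memristor: conductance is nudged up or down
# by voltage pulses instead of switching only between two binary states.
# Parameter values are hypothetical and chosen purely for illustration.

class AnalogMemristor:
    def __init__(self, g_min=1e-6, g_max=1e-4, g0=5e-5):
        self.g_min, self.g_max = g_min, g_max
        self.g = g0  # conductance in siemens

    def apply_pulse(self, voltage, step=5e-6):
        # A positive pulse increases conductance (SET direction),
        # a negative pulse decreases it (RESET direction).
        self.g += step if voltage > 0 else -step
        # Clamp to the physically allowed conductance window.
        self.g = min(self.g_max, max(self.g_min, self.g))
        return self.g

m = AnalogMemristor()
states = [m.apply_pulse(+1.0) for _ in range(5)]
print(len(set(states)))  # 5 distinct intermediate conductance values
```

The key point the sketch captures is that the device passes through many intermediate conductance values – the analogue of intermediate synaptic weights – rather than jumping between just two.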
“The detailed physical mechanism behind the function of the memristors in question is still debated. However, the qualitative model is as follows: in the metal–ultrathin oxide–metal structure, charged point defects, such as vacancies of oxygen atoms, are formed and move around in the oxide layer when exposed to an electric field. It is these defects that are responsible for the reversible change in the conductivity of the oxide layer,” says the co-author of the paper and researcher of MIPT’s Laboratory of Functional Materials and Devices for Nanoelectronics, Sergey Zakharchenko.
The authors used the newly developed “analogue” memristors to model various learning mechanisms (“plasticity”) of biological synapses. In particular, this involved functions such as long-term potentiation (LTP) or long-term depression (LTD) of a connection between two neurons. It is generally accepted that these functions are the underlying mechanisms of memory in the brain.
The authors also succeeded in demonstrating a more complex mechanism – spike-timing-dependent plasticity, i.e. the dependence of the strength of the connection between two neurons on the relative timing of their firing. It had previously been shown that this mechanism is responsible for associative learning – the ability of the brain to find connections between different events.
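A common textbook form of the spike-timing-dependent plasticity rule is an exponential window: the sign of the weight change depends on whether the presynaptic neuron fired before or after the postsynaptic one, and its magnitude decays with the time difference. The sketch below uses this standard exponential form with hypothetical constants; it is not the specific dependency measured in the paper.

```python
import math

# Illustrative STDP window: the weight change depends on the time
# difference dt = t_post - t_pre between the two neurons' spikes.
# a_plus, a_minus and tau_ms are hypothetical constants.

def stdp_dw(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    if dt_ms > 0:    # pre fires before post -> potentiation (LTP)
        return a_plus * math.exp(-dt_ms / tau_ms)
    elif dt_ms < 0:  # post fires before pre -> depression (LTD)
        return -a_minus * math.exp(dt_ms / tau_ms)
    return 0.0

print(stdp_dw(10.0) > 0)   # True: connection strengthened
print(stdp_dw(-10.0) < 0)  # True: connection weakened
```

The closer in time the two spikes are, the larger the change in connection strength – the same qualitative shape the authors reproduced in their memristor devices.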
To demonstrate this function in their memristor devices, the authors purposefully used an electric signal that reproduced, as far as possible, the signals in living neurons, and they obtained a dependency very similar to that observed in living synapses (see fig. 3).
Fig. 3. The change in the conductivity of memristors as a function of the temporal separation between “spikes” (right) and the change in the potential of neuron connections in biological neural networks.
Source: MIPT press office
These results allowed the authors to confirm that the elements that they had developed could be considered a prototype of the “electronic synapse”, which could be used as a basis for the hardware implementation of artificial neural networks.
“We have created a baseline matrix of nanoscale memristors demonstrating the properties of biological synapses. Thanks to this research, we are now one step closer to building an artificial neural network. It may only be the very simplest of networks, but it is nevertheless a hardware prototype,” said the head of MIPT’s Laboratory of Functional Materials and Devices for Nanoelectronics, Andrey Zenkevich.