Researchers have shown how to write any desired magnetic pattern onto nanowires, which could help computers mimic how the brain processes information.
Much current computer hardware, such as hard drives, uses magnetic memory devices. These rely on magnetic states – the directions in which microscopic magnets are pointing – to encode and read information.
With this new writing method, we open up research into ‘training’ these magnetic nanowires to solve useful problems.
– Dr Jack Gartside
Exotic magnetic states – such as a point where three south poles meet – represent complex systems. These may act in a similar way to many complex systems found in nature, such as the way our brains process information.
Computing systems that are designed to process information in similar ways to our brains are known as ‘neural networks’. There are already powerful software-based neural networks – for example one recently beat the human champion at the game ‘Go’ – but their efficiency is limited as they run on conventional computer hardware.
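As a rough illustration of the software approach described above (not the researchers' magnetic system), a neural network at its smallest is just weighted sums passed through nonlinear "squashing" functions. The sketch below is a hypothetical toy with hand-picked weights, chosen purely to show how a few artificial neurons combine to compute something no single neuron can (here, exclusive-or):

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum squashed by a sigmoid."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

def tiny_network(x1, x2):
    """A two-layer toy network wired (by hand, not training) to compute XOR."""
    h1 = neuron([x1, x2], [20.0, 20.0], -10.0)    # fires when x1 OR x2
    h2 = neuron([x1, x2], [-20.0, -20.0], 30.0)   # fires unless x1 AND x2
    return neuron([h1, h2], [20.0, 20.0], -30.0)  # fires when h1 AND h2

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", round(tiny_network(a, b)))
```

In real software networks the weights are learned from data rather than set by hand, and there are millions of them, which is why running each weighted sum on conventional processors becomes the efficiency bottleneck that hardware neural networks aim to remove.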
Now, researchers from Imperial College London have devised a method for writing magnetic information in any desired pattern, using the tiny magnetised tip of an instrument called a magnetic force microscope.
With this new writing method, arrays of magnetic nanowires may be able to function as hardware neural networks – potentially more powerful and efficient than software-based approaches.
The team, from the Departments of Physics and Materials at Imperial, demonstrated their system by writing patterns that have never been seen before. They published their results today in Nature Nanotechnology.
Dr Jack Gartside, first author from the Department of Physics, said: “With this new writing method, we open up research into ‘training’ these magnetic nanowires to solve useful problems. If successful, this will bring hardware neural networks a step closer to reality.”
As well as applications in computing, the method could be used to study fundamental aspects of complex systems, by creating magnetic states that are far from optimal (such as three south poles together) and seeing how the system responds.