Memristive technology has long been attractive for neuromorphic computing. Among other things, it would permit building artificial neural network (ANN) circuits that process information in parallel and more directly emulate how neuronal circuits in the brain work. Recent work led by researchers at Oak Ridge National Laboratory and the University of Tennessee proposes a mixed-signal approach that leverages memristive technology to build better ANNs.
“[Our] mixed-signal approach implements neural networks with spiking events in a synchronous way. Moreover, the use of nano-scale memristive devices saves both area and power in the system… The proposed [system] includes synchronous digital long term plasticity (DLTP), an online learning methodology that helps the system train the neural networks during the operation phase and improves the efficiency in learning considering the power consumption and area overhead,” writes Catherine Schuman, a Liane Russell Early Career Fellow in Computational Data Analytics at Oak Ridge National Laboratory, and colleagues[i].
Their paper, “Memristive Mixed-Signal Neuromorphic Systems: Energy-Efficient Learning at the Circuit-Level,” was published in the IEEE Journal on Emerging and Selected Topics in Circuits and Systems.
The researchers point out that digital and analog approaches to building ANNs each have drawbacks. While digital implementations offer precision, robustness, noise resilience, and scalability, they are area intensive. Conversely, analog counterparts are efficient in terms of silicon area and processing speed, but “rely on representing synaptic weights as volatile voltages on capacitors or in resistors, which do not lend themselves to energy and area efficient learning.”
Instead, they propose a mixed-signal system where communication and control is digital while the core multiply-and-accumulate functionality is analog. Researchers used a hafnium-oxide memristor design based on earlier work (“A practical hafnium-oxide memristor model suitable for circuit design and simulation,” in Proceedings of IEEE International Symposium on Circuits and Systems).
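The analog multiply-and-accumulate at the heart of such designs follows from basic circuit laws: each synaptic weight is stored as a memristor conductance, and summing column currents in a crossbar performs the dot product. A minimal sketch of that idea, with purely illustrative conductance and voltage values (not taken from the paper):

```python
import numpy as np

# Illustrative sketch: synaptic weights stored as device conductances G
# (siemens). Applying input voltages V to the crossbar rows sums currents
# on each column (Kirchhoff's current law), yielding an analog
# multiply-and-accumulate: I_j = sum_i V_i * G_ij.
rng = np.random.default_rng(0)
n_inputs, n_neurons = 4, 3
G = rng.uniform(1e-6, 1e-4, size=(n_inputs, n_neurons))  # conductance matrix
V = np.array([0.2, 0.0, 0.2, 0.2])  # input voltages (active rows spike)

I = V @ G  # per-column output currents: the MAC result
print(I)
```

The point of the sketch is that the multiplication and summation cost no explicit arithmetic circuitry at all; they fall out of Ohm's law and current summation on a shared wire.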
Their design (figure two, shown below) consists of m × n memristive neuromorphic cores. “Each core has several memristive synapses and one mixed-signal neuron (analog in, digital out) to implement a spiking neural network. This arrangement helps maintain similar capacitance at the synaptic outputs and corresponding neurons. The similar distance between synapse and inputs also results in negligible difference in charge accumulation,” write the authors.
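The “analog in, digital out” neuron the authors describe behaves like an integrate-and-fire unit: analog synaptic current charges a membrane node until a threshold comparator emits a digital spike. A hedged behavioral sketch, with illustrative leak and threshold constants that are not from the paper:

```python
# Behavioral sketch of a mixed-signal (analog in, digital out) neuron,
# assuming leaky integrate-and-fire dynamics. Constants are illustrative.
def integrate_and_fire(currents, threshold=1.0, leak=0.9):
    v = 0.0           # membrane potential (analog charge accumulation)
    spikes = []
    for i in currents:
        v = leak * v + i      # leaky integration of input current
        if v >= threshold:
            spikes.append(1)  # digital spike out
            v = 0.0           # reset after firing
        else:
            spikes.append(0)
    return spikes

print(integrate_and_fire([0.4, 0.5, 0.6, 0.1]))  # → [0, 0, 1, 0]
```

In the actual circuit the integration happens on a capacitance at the neuron input, which is why the authors emphasize keeping capacitance and wire length similar across synapses: it keeps charge accumulation comparable from core to core.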
Also exciting is the researchers’ approach to implementing learning. Most ANNs require offline training. For a network to learn online, long term plasticity plays an important role, training the circuit with continuous updates of synaptic weights based on the timing of pre- and post-neuron firings.
“Instead of carefully crafting analog tails to provide variation in the voltage across the synapses, we utilize digital pre- and post-neuron firing signals and apply pulse modulation to implement a digital LTP (DLTP) technique…Basically the online learning process implemented here is one clock cycle tracking version of Spike time Dependent Plasticity… A more thorough STDP learning implementation would need to track several clock cycles before and after the post-neuron fire leading to more circuitry and hence increased power and area. Our DLTP approach acts similarly but ensures lower area and power,” write the authors.
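The logic the authors describe can be paraphrased in a short sketch: instead of tracking many clock cycles of spike history as full STDP would, only adjacent cycles are compared. This is an assumption-laden rendering of the rule, not the authors’ circuit; the step size, clipping bounds, and function name are illustrative:

```python
# Hedged sketch of a one-cycle digital LTP (DLTP) update. A pre-neuron
# spike one clock cycle before a post-neuron spike potentiates the
# weight; a pre spike one cycle after the post spike depresses it.
# Step size and [0, 1] clipping are illustrative assumptions.
def dltp_update(w, pre, post, step=0.05):
    for t in range(1, len(post)):
        if pre[t - 1] and post[t]:
            w = min(1.0, w + step)  # potentiate: pre led post by one cycle
        if post[t - 1] and pre[t]:
            w = max(0.0, w - step)  # depress: pre lagged post by one cycle
    return w

pre  = [1, 0, 0, 1]   # digital pre-neuron firing signal per clock cycle
post = [0, 1, 0, 0]   # digital post-neuron firing signal per clock cycle
print(dltp_update(0.5, pre, post))
```

Because only one cycle of history is kept, the hardware needs just a single register of delayed firing signals per synapse, which is where the area and power savings over multi-cycle STDP come from.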
Link to paper: http://ieeexplore.ieee.org/document/8119503/
Feature image source: ORNL
[i] Gangotree Chakma, Student Member, IEEE, Md Musabbir Adnan, Student Member, IEEE, Austin R. Wyer, Student Member, IEEE, Ryan Weiss, Student Member, IEEE, Catherine D. Schuman, Member, IEEE, and Garrett S. Rose, Member, IEEE