Memristive (memory + resistor) arrays are matrix-like structures of a special kind of electrical resistor with adjustable resistance. Because they perform vector-matrix multiplication, or Multiply-Accumulate (MAC) operations, in a single step, they are considered highly interesting for ANN accelerators.
But can we further exploit this unique feature for neuromorphic hardware systems?
This working group will discuss which neuromorphic algorithms can benefit from this fast and "cheap" inference feature, and how, while not suffering too much from the relatively complex update of memristive devices.
A memristive array consists of a matrix of memristive devices. The horizontal input lines connect to all devices in each row of the matrix, and the vertical output lines connect all devices in each column. If a voltage vector is applied to the input lines, each memristive device generates a current from its input voltage and its current conductance according to Ohm's law. All generated currents are summed column-wise by Kirchhoff's current law. This performs a Multiply & Accumulate (MAC) operation in a single shot.
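The column-wise summation of device currents is mathematically a vector-matrix product. A minimal digital emulation of this analog step, assuming a hypothetical 2x2 conductance matrix and voltage vector chosen only for illustration:

```python
import numpy as np

# Hypothetical conductance matrix G (in siemens):
# rows = horizontal input lines, columns = vertical output lines.
G = np.array([[1e-6, 2e-6],
              [3e-6, 4e-6]])

# Voltage vector applied to the input lines (in volts).
v = np.array([0.1, 0.2])

# Ohm's law per device (I = V * G) and Kirchhoff's current law per column
# give the output currents of all columns at once; the analog array
# computes this entire product in a single step.
i_out = v @ G
print(i_out)  # column currents in amperes
```

In the physical array this product costs one read cycle regardless of matrix size, which is the source of the speedup over a digital MAC loop.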
In conventional ANNs this offers great benefits for inference, as it speeds up the MAC operation, which is usually computation- and memory-intensive. It can also be applied in the backpropagation step.
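The use in backpropagation rests on the fact that driving the columns and sensing the rows yields the transposed product, which is exactly what is needed to propagate the error backward. A sketch under the same hypothetical 2x2 conductances as above:

```python
import numpy as np

# Same hypothetical conductance matrix as in the forward pass.
G = np.array([[1e-6, 2e-6],
              [3e-6, 4e-6]])

# Hypothetical error signal arriving on the output (column) lines.
delta = np.array([0.5, -0.25])

# Reading the array in the transposed direction (drive columns,
# sense rows) computes G @ delta, the backpropagated error vector,
# again in a single analog step.
back_error = G @ delta
print(back_error)
```

The same physical weights thus serve both passes, avoiding a separate transposed copy of the weight matrix.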
For neuromorphic hardware, one could imagine a memristive array acting as an all-to-all connection between neurons without the need for address-based routing. Furthermore, a possible binary operation of the memristive devices could be used to represent structural plasticity in a network of spiking neurons.
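In this binary mode, each device encodes only whether a synapse exists, and structural plasticity becomes switching single devices between their two resistance states. A toy sketch with an invented 4x4 connectivity mask and spike vector (all values hypothetical):

```python
import numpy as np

# Binary device states: 1 = synapse present (low resistance),
# 0 = synapse absent (high resistance). Rows = presynaptic neurons,
# columns = postsynaptic neurons.
mask = np.array([[1, 1, 0, 0],
                 [0, 1, 1, 0],
                 [0, 0, 1, 1],
                 [1, 0, 0, 1]])

# Structural plasticity: prune one connection and grow another by
# flipping individual devices between their two states.
mask[0, 1] = 0  # prune synapse from neuron 0 to neuron 1
mask[1, 3] = 1  # grow synapse from neuron 1 to neuron 3

# A binary spike vector on the rows; the column sums deliver the
# number of active presynaptic inputs to each postsynaptic neuron.
spikes = np.array([1, 0, 1, 0])
fan_in = spikes @ mask
print(fan_in)
```

Since only single-bit writes are needed, this use case sidesteps much of the difficulty of precise analog weight updates.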
The final goal would be to identify applications of memristive arrays in neuromorphic hardware that make use of their massive inference capability, investigate how their performance can benefit, and define boundaries for the co-integrated circuitry.