Efficient and Fault-Tolerant Memristive Neural Networks with In-Situ Training
By: Santlal Prajapat, Manobendra Nath Mondal, Susmita Sur-Kolay
Potential Business Impact:
Makes computers learn faster and use less power.
Neuromorphic architectures, which incorporate parallel and in-memory processing, are crucial for accelerating artificial neural network (ANN) computations. This work presents a novel memristor-based multi-layer neural network (memristive MLNN) architecture and an efficient in-situ training algorithm. The proposed design performs matrix-vector multiplications, outer products, and weight updates in constant time $\mathcal{O}(1)$ by leveraging the inherent parallelism of memristive crossbars. Each synapse is realized with a single memristor, eliminating the need for transistors and offering enhanced area and energy efficiency. The architecture is evaluated through LTspice simulations on the IRIS, NASA Asteroid, and Breast Cancer Wisconsin datasets, achieving classification accuracies of 98.22\%, 90.43\%, and 98.59\%, respectively. Robustness is assessed by introducing stuck-at-conducting-state faults in randomly selected memristors. The effects of nonlinearity in memristor conductance and of 10\% device variation are also analyzed. The simulation results establish that the network's performance is not significantly degraded by faulty memristors, nonlinearity, or device variation.
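As a rough illustration of why these crossbar operations run in constant time, the sketch below models an idealized linear crossbar in NumPy. It is a minimal sketch under stated assumptions, not the paper's implementation: the conductance range, learning rate, fault rate, and names such as `G_max` are illustrative, and real device physics (nonlinear conductance, programming pulses) is abstracted away.

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealized linear crossbar: each synapse is one memristor whose
# conductance G[i, j] encodes the weight from input column j to output row i.
n_in, n_out = 4, 3
G = rng.uniform(0.1, 1.0, size=(n_out, n_in))  # conductances (illustrative units)

# Matrix-vector multiplication: applying input voltages v to the columns
# produces output currents i = G @ v on the rows in a single analog step
# (Ohm's law per device, Kirchhoff's current law per row), i.e. O(1) time.
v = rng.uniform(-1.0, 1.0, size=n_in)
i_out = G @ v

# Outer-product weight update: with an error signal delta on the rows and
# the input v on the columns, every memristor sees its own programming
# condition simultaneously, so the full rank-1 update is also one O(1) step.
eta = 0.01  # illustrative learning rate
delta = rng.standard_normal(n_out)
G += eta * np.outer(delta, v)

# Robustness check in the spirit of the paper's evaluation:
# stuck-at-conducting-state faults pin randomly chosen devices at maximum
# conductance, and device variation perturbs each conductance by up to 10%.
G_max = 1.0                                    # assumed max conductance
faults = rng.random(G.shape) < 0.05            # ~5% faulty devices (assumed rate)
G_faulty = np.where(faults, G_max, G)
G_faulty *= 1.0 + rng.uniform(-0.10, 0.10, size=G.shape)
print("ideal output: ", G @ v)
print("faulty output:", G_faulty @ v)
```

The key point the sketch conveys is that the loop-free `G @ v` and `np.outer` expressions mirror operations the crossbar performs physically in parallel across all devices at once, rather than element by element.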
Similar Papers
Efficient Memristive Spiking Neural Networks Architecture with Supervised In-Situ STDP Method
Emerging Technologies
Makes smart gadgets use way less power.
Memristor-Based Neural Network Accelerators for Space Applications: Enhancing Performance with Temporal Averaging and SIRENs
Systems and Control
Helps spacecraft AI learn and work better.
Fault-Free Analog Computing with Imperfect Hardware
Emerging Technologies
Makes computers work even with broken parts.