An Energy-Efficient RFET-Based Stochastic Computing Neural Network Accelerator
By: Sheng Lu, Qianhou Qu, Sungyong Jung, and more
Potential Business Impact:
Reduces the power consumption and chip area required for neural network hardware.
Stochastic computing (SC) offers significant reductions in hardware complexity relative to traditional binary implementations of convolutional neural networks (CNNs), but stochastic computing neural networks (SCNNs) still suffer from high resource usage in components such as stochastic number generators (SNGs) and accumulative parallel counters (APCs), which limits overall performance. This paper introduces a novel SCNN architecture based on reconfigurable field-effect transistors (RFETs), whose device-level reconfigurability enables highly efficient and compact SNGs, APCs, and other core modules. A dedicated SCNN accelerator architecture is also developed for system-level simulation. Using publicly available open-source standard cell libraries, experimental results show that the proposed RFET-based SCNN accelerator achieves substantial reductions in area, latency, and energy consumption compared with a FinFET-based design at the same technology node.
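For readers unfamiliar with the SC primitives the abstract names, the minimal Python sketch below illustrates how they compute: an SNG converts a binary value into a bitstream, a single AND gate performs unipolar multiplication, and an APC-style counter accumulates many parallel bitstreams into a sum. This is only an illustration of the general technique, not the paper's RFET design; the software RNG stands in for the hardware LFSR a real SNG would use, and the sequence length, seed, and example values are assumptions chosen for the demo.

```python
import random

def sng(value, n_bits, rng):
    """Stochastic number generator (SNG): emit an n_bits-long bitstream whose
    fraction of 1s approximates `value` in [0, 1]. Hardware SNGs compare the
    value against a pseudo-random number (e.g., from an LFSR) each cycle; a
    software RNG stands in for the LFSR here."""
    return [1 if rng.random() < value else 0 for _ in range(n_bits)]

def sc_multiply(stream_a, stream_b):
    """In unipolar SC, one AND gate multiplies two independent bitstreams,
    since P(a AND b) = P(a) * P(b)."""
    return [a & b for a, b in zip(stream_a, stream_b)]

def apc(streams):
    """Accumulative parallel counter (APC): count the 1s across parallel
    bitstreams each cycle and accumulate the counts -- the SC analogue of a
    neuron's weighted-sum stage."""
    return sum(sum(bits) for bits in zip(*streams))

if __name__ == "__main__":
    rng = random.Random(42)
    N = 1024  # sequence length: longer streams trade latency for accuracy
    x, w = 0.75, 0.5
    product = sc_multiply(sng(x, N, rng), sng(w, N, rng))
    print(f"SC estimate of {x}*{w}: {sum(product) / N:.3f}")  # ~0.375

    # Dot-product-style accumulation of several weighted inputs via the APC.
    pairs = [(0.8, 0.25), (0.4, 0.9), (0.6, 0.5)]
    streams = [sc_multiply(sng(v, N, rng), sng(wi, N, rng)) for v, wi in pairs]
    exact = sum(v * wi for v, wi in pairs)  # 0.86
    print(f"APC estimate of dot product: {apc(streams) / N:.3f} (exact {exact})")
```

The sketch also makes the cost argument in the abstract concrete: each input value needs its own SNG and random source, and every neuron needs an APC, which is why these components dominate SCNN resource usage and are the targets of the proposed RFET-based redesign.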
Similar Papers
Implementation of high-efficiency, lightweight residual spiking neural network processor based on field-programmable gate arrays
Neural and Evolutionary Computing
Reduces the power consumption of AI inference chips while improving speed.
Energy-Efficient Stochastic Computing (SC) Neural Networks for Internet of Things Devices With Layer-Wise Adjustable Sequence Length (ASL)
Machine Learning (CS)
Lowers the power consumption and latency of machine learning on IoT devices.
Sparsity-Aware Streaming SNN Accelerator with Output-Channel Dataflow for Automatic Modulation Classification
Hardware Architecture
Enables fast, efficient on-device classification of wireless signal modulation.