An Energy-Efficient RFET-Based Stochastic Computing Neural Network Accelerator

Published: December 6, 2025 | arXiv ID: 2512.22131v1

By: Sheng Lu, Qianhou Qu, Sungyong Jung, and more

Potential Business Impact:

Enables neural-network hardware that consumes less power and occupies less chip area.

Business Areas:
Field-Programmable Gate Array (FPGA) Hardware

Stochastic computing (SC) offers significant reductions in hardware complexity for traditional convolutional neural networks (CNNs), but stochastic computing neural networks (SCNNs) still suffer from high resource usage due to components such as stochastic number generators (SNGs) and accumulative parallel counters (APCs), which limit performance. This paper introduces a novel SCNN architecture based on reconfigurable field-effect transistors (RFETs), whose device-level reconfigurability enables the design of highly efficient and compact SNGs, APCs, and other core modules. A dedicated SCNN accelerator architecture is also developed for system-level simulation. Experimental results obtained with publicly available open-source standard cell libraries show that the proposed RFET-based SCNN accelerator achieves substantial reductions in area, latency, and energy consumption compared to a FinFET-based design at the same technology node.
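
To make the SC building blocks named in the abstract concrete, the sketch below models the standard unipolar stochastic-computing operations that SNGs and APCs implement: encoding a value as a bitstream whose density of 1s equals the value, multiplying two values with a bitwise AND, and counting 1s to accumulate results. This is a generic software illustration only, not the paper's RFET-based circuits; the function names (`sng`, `sc_multiply`, `apc_sum`) and the use of NumPy's random generator in place of a hardware LFSR are assumptions made for this example.

```python
import numpy as np

def sng(value, length, rng):
    """Stochastic number generator: encode a probability in [0, 1] as a
    bitstream whose fraction of 1s approximates the value. Hardware SNGs
    usually compare against an LFSR; np.random stands in for that here."""
    return (rng.random(length) < value).astype(np.uint8)

def sc_multiply(stream_a, stream_b):
    """Unipolar SC multiplication: a bitwise AND of two independent
    streams yields a stream whose 1-density is the product."""
    return stream_a & stream_b

def apc_sum(streams):
    """Accumulative parallel counter (software model): count the 1s across
    parallel product streams each cycle and accumulate over all cycles."""
    return int(np.sum(streams))

rng = np.random.default_rng(0)
length = 4096
a, b = 0.75, 0.5
sa = sng(a, length, rng)
sb = sng(b, length, rng)
prod = sc_multiply(sa, sb)
print("estimated a*b:", prod.mean())            # close to 0.375
print("APC accumulation:", apc_sum(np.stack([prod])))
```

Running the sketch shows the 1-density of the product stream approaching 0.75 × 0.5 = 0.375, the basic property that dedicated SNG and APC hardware realizes in circuit form.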

Country of Origin
🇺🇸 United States

Page Count
12 pages

Category
Computer Science:
Hardware Architecture