Online Learning Extreme Learning Machine with Low-Complexity Predictive Plasticity Rule and FPGA Implementation

Published: December 25, 2025 | arXiv ID: 2512.21777v1

By: Zhenya Zang, Xingda Li, David Day Uei Li

Potential Business Impact:

Enables computers to learn faster with less power.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

We propose a simplified, biologically inspired predictive local learning rule that eliminates the need for global backpropagation in conventional neural networks and for membrane-potential integration in event-based training. Weight updates are triggered only by prediction errors and are performed as sparse, binary-driven vector additions. We integrate this rule into an extreme learning machine (ELM), replacing the conventional, computationally intensive matrix inversion. Compared with a standard ELM, our approach reduces training complexity from O(M^3) to O(M), where M is the number of hidden-layer nodes, while maintaining comparable accuracy (within 3.6% and 2.0% degradation on the training and test sets, respectively). We demonstrate an FPGA implementation and compare it with existing designs, showing significant reductions in computational and memory requirements. The design shows strong potential for energy-efficient online learning on low-cost edge devices.
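To make the idea concrete, here is a minimal sketch (not the authors' exact rule) of an ELM whose output weights are trained by an error-triggered local update instead of the usual pseudoinverse solve. The binary step activation, the learning rate `lr`, and the perceptron-style class update are illustrative assumptions; the key point it shows is that each update is a vector addition over the M hidden activations, i.e. O(M) per sample rather than an O(M^3) matrix inversion.

```python
# Sketch only: error-triggered local updates replacing the ELM pseudoinverse.
# The binarized hidden layer and sign-of-error update rule are assumptions
# made for illustration, not the paper's exact learning rule.
import numpy as np

rng = np.random.default_rng(0)

def elm_hidden(X, W_in, b):
    """Random-projection hidden layer with a binary (step) activation."""
    return (X @ W_in + b > 0).astype(np.float32)   # sparse, binary features

def train_predictive(X, y, n_hidden=256, epochs=5, lr=1.0):
    """Online training: update output weights only on prediction errors.

    Each update adds or subtracts the binary hidden-activation vector,
    so the per-sample cost is O(M) in the number of hidden nodes M.
    """
    n_in, n_out = X.shape[1], y.shape[1]             # y is one-hot
    W_in = rng.standard_normal((n_in, n_hidden)).astype(np.float32)
    b = rng.standard_normal(n_hidden).astype(np.float32)
    W_out = np.zeros((n_hidden, n_out), dtype=np.float32)

    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            h = elm_hidden(x_i[None, :], W_in, b)[0]
            pred = int(np.argmax(h @ W_out))
            target = int(np.argmax(y_i))
            if pred != target:                        # prediction error -> update
                W_out[:, target] += lr * h            # binary-driven vector additions
                W_out[:, pred]   -= lr * h
    return W_in, b, W_out
```

Because the hidden activations are binary, the additions above reduce to selectively incrementing weight entries, which is the kind of operation that maps cheaply onto FPGA logic without multipliers or a matrix-inversion block.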

Country of Origin
🇬🇧 United Kingdom

Page Count
5 pages

Category
Computer Science:
Hardware Architecture