Optimizing Neural Networks with Learnable Non-Linear Activation Functions via Lookup-Based FPGA Acceleration
By: Mengyuan Yin, Benjamin Chen Ming Choong, Chuping Qu, and more
Potential Business Impact:
Makes smart devices run faster and use less power.
Learned activation functions in models like Kolmogorov-Arnold Networks (KANs) outperform fixed-activation architectures in accuracy and interpretability; however, their computational complexity poses critical challenges for energy-constrained edge AI deployments. Conventional CPUs and GPUs incur prohibitive latency and power costs when evaluating higher-order activations, limiting deployability under ultra-tight energy budgets. We address this with a reconfigurable lookup-based architecture on edge FPGAs. By coupling fine-grained quantization with adaptive lookup tables, our design minimizes energy-intensive arithmetic operations while preserving activation fidelity. FPGA reconfigurability enables dynamic hardware specialization for learned functions, a key advantage for edge systems that require post-deployment adaptability. Evaluations on KANs - where unique activation functions play a critical role - demonstrate that our FPGA-based design achieves superior computational speed and over $10^4$ times higher energy efficiency compared to edge CPUs and GPUs, while matching their accuracy with minimal footprint overhead. This positions our approach as a practical enabler for energy-critical edge AI, where computational intensity and power constraints have traditionally precluded the use of adaptive activation networks.
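To illustrate the quantize-and-lookup idea described in the abstract, here is a minimal software sketch: a learned activation is sampled once over a quantized input range to build a lookup table, so that inference needs only index computation and a table read rather than evaluating the function's arithmetic. This is a conceptual approximation, not the paper's FPGA implementation; the function names (build_lut, lut_activation), the 8-bit index width, the input range, and the SiLU-like placeholder activation are all illustrative assumptions.

```python
import numpy as np

def build_lut(activation_fn, x_min=-4.0, x_max=4.0, n_bits=8):
    """Precompute a lookup table for a learned activation.

    The table holds 2**n_bits outputs sampled on a uniform grid, so
    runtime evaluation reduces to quantizing the input and reading
    one table entry (no multiplies for the activation itself).
    """
    n_entries = 2 ** n_bits
    grid = np.linspace(x_min, x_max, n_entries, dtype=np.float32)
    return activation_fn(grid).astype(np.float32), x_min, x_max

def lut_activation(x, lut, x_min, x_max):
    """Evaluate the activation via nearest-entry table lookup."""
    n_entries = lut.shape[0]
    scale = (n_entries - 1) / (x_max - x_min)
    idx = np.clip(np.rint((x - x_min) * scale), 0, n_entries - 1).astype(np.int32)
    return lut[idx]

if __name__ == "__main__":
    # Toy stand-in for a trained KAN activation (SiLU-like), illustrative only.
    learned = lambda x: x / (1.0 + np.exp(-x))
    lut, lo, hi = build_lut(learned, n_bits=8)

    x = np.random.uniform(-4.0, 4.0, size=1024).astype(np.float32)
    err = np.max(np.abs(lut_activation(x, lut, lo, hi) - learned(x)))
    print(f"max abs LUT error with 8-bit indexing: {err:.4f}")
```

On an FPGA, the table would map to on-chip memory and the index step to simple fixed-point logic; the sketch above only conveys why replacing arithmetic with a table read saves energy, under the stated assumptions about quantization granularity and input range.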
Similar Papers
KANELÉ: Kolmogorov-Arnold Networks for Efficient LUT-based Evaluation
Hardware Architecture
Makes smart chips learn faster and use less power.
Hardware Acceleration of Kolmogorov-Arnold Network (KAN) in Large-Scale Systems
Hardware Architecture
Makes AI smarter with fewer computer parts.
QuantKAN: A Unified Quantization Framework for Kolmogorov Arnold Networks
Machine Learning (CS)
Makes smart computer brains smaller and faster.