Score: 1

EdgeLDR: Quaternion Low-Displacement Rank Neural Networks for Edge-Efficient Deep Learning

Published: January 8, 2026 | arXiv ID: 2601.05379v1

By: Vladimir Frants, Sos Agaian, Karen Panetta

Potential Business Impact:

Compresses neural networks so they run faster and need less memory on small edge devices.

Business Areas:
Science and Engineering

Deploying deep neural networks on edge devices is often limited by the memory traffic and compute cost of dense linear operators. While quaternion neural networks improve parameter efficiency by coupling multiple channels through Hamilton products, they typically retain unstructured dense weights; conversely, structured matrices enable fast computation but are usually applied in the real domain. This paper introduces EdgeLDR, a practical framework for quaternion block-circulant linear and convolutional layers that combines quaternion channel mixing with block-circulant parameter structure and enables FFT-based evaluation through the complex adjoint representation. We present reference implementations of EdgeLDR layers and compare FFT-based computation against a naive spatial-domain realization of quaternion circulant products. FFT evaluation yields large empirical speedups over the naive implementation and keeps latency stable as block size increases, making larger compression factors computationally viable. We further integrate EdgeLDR layers into compact CNN and Transformer backbones and evaluate accuracy-compression trade-offs on 32x32 RGB classification (CIFAR-10/100, SVHN) and hyperspectral image classification (Houston 2013, Pavia University), reporting parameter counts and CPU/GPU latency. The results show that EdgeLDR layers provide significant compression with competitive accuracy.
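The two building blocks the abstract names are standard identities and easy to sketch: a quaternion can be written as a pair of complex numbers (its complex adjoint form), and a circulant matrix-vector product reduces to an FFT, a pointwise multiply, and an inverse FFT. The minimal NumPy sketch below illustrates both; it is hypothetical and not the authors' code, and the helper names (`hamilton_product`, `circulant_matvec_naive`, `circulant_matvec_fft`) are made up for illustration.

```python
import numpy as np

# Hypothetical sketch (not the paper's released code) of two ingredients of an
# EdgeLDR-style layer: (1) the quaternion Hamilton product via the complex
# adjoint form q = alpha + beta*j with alpha, beta complex, and (2) a circulant
# matrix-vector product evaluated spatially (naive) versus via FFT.

def hamilton_product(q1, q2):
    """Hamilton product of quaternions given as complex pairs (alpha, beta)."""
    a1, b1 = q1
    a2, b2 = q2
    # (a1 + b1 j)(a2 + b2 j) = (a1 a2 - b1 conj(b2)) + (a1 b2 + b1 conj(a2)) j
    return (a1 * a2 - b1 * np.conj(b2), a1 * b2 + b1 * np.conj(a2))

def circulant_matvec_naive(c, x):
    """O(n^2) reference: multiply by the circulant matrix with first column c."""
    n = len(c)
    C = np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])
    return C @ x

def circulant_matvec_fft(c, x):
    """O(n log n) via the convolution theorem: C x = ifft(fft(c) * fft(x))."""
    return np.fft.ifft(np.fft.fft(c) * np.fft.fft(x))

# Sanity checks: i * j = k in the complex-pair encoding, and the two
# circulant products agree on random real inputs.
assert hamilton_product((1j, 0.0), (0.0, 1.0)) == (0j, 1j)
rng = np.random.default_rng(0)
n = 256
c, x = rng.standard_normal(n), rng.standard_normal(n)
assert np.allclose(circulant_matvec_naive(c, x), circulant_matvec_fft(c, x).real)
```

The FFT path replaces the O(n^2) dense product with O(n log n) transforms, which is consistent with the abstract's observation that latency stays roughly flat as the circulant block size, and hence the compression factor, grows.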

Country of Origin
🇺🇸 United States

Page Count
12 pages

Category
Computer Science:
Computer Vision and Pattern Recognition