LogHD: Robust Compression of Hyperdimensional Classifiers via Logarithmic Class-Axis Reduction
By: Sanggeon Yun, Hyunwoo Oh, Ryozo Masukawa, and more
Potential Business Impact:
Makes computers remember more with less energy.
Hyperdimensional computing (HDC) suits memory-, energy-, and reliability-constrained systems, yet the standard "one prototype per class" design requires $O(CD)$ memory (with $C$ classes and dimensionality $D$). Prior compaction methods reduce $D$ (the feature axis), improving storage and compute but weakening robustness. We introduce LogHD, a logarithmic class-axis reduction that replaces the $C$ per-class prototypes with $n\!\approx\!\lceil\log_k C\rceil$ bundle hypervectors (alphabet size $k$) and decodes in an $n$-dimensional activation space, cutting memory to $O(D\log_k C)$ while preserving $D$. LogHD uses a capacity-aware codebook and profile-based decoding, and composes with feature-axis sparsification. Across datasets and under injected bit flips, LogHD attains competitive accuracy with smaller models and higher resilience at matched memory: under equal memory, it sustains target accuracy at roughly $2.5$-$3.0\times$ higher bit-flip rates than feature-axis compression. An ASIC instantiation delivers $498\times$ energy efficiency and $62.6\times$ speedup over an AMD Ryzen 9 9950X, $24.3\times$ and $6.58\times$ over an NVIDIA RTX 4090, and is $4.06\times$ more energy-efficient and $2.19\times$ faster than a feature-axis HDC ASIC baseline.
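To make the class-axis idea concrete, here is a minimal Python sketch of one plausible instantiation, assuming bipolar hypervectors and a plain base-$k$ codebook; the names (`roles`, `train`, `predict`) and the decoding rule are illustrative assumptions, not the paper's capacity-aware codebook construction or actual API.

```python
# Minimal sketch of logarithmic class-axis reduction: store n ~ ceil(log_k C)
# bundle hypervectors instead of C per-class prototypes. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

C, D, k = 10, 4096, 2                       # classes, dimensionality, alphabet size
n = int(np.ceil(np.log(C) / np.log(k)))     # n = ceil(log_k C) bundles (4 for C=10, k=2)

# Codebook stand-in: each class gets a length-n code over {0..k-1}
# (here simply its base-k digits; the paper uses a capacity-aware codebook).
codes = np.array([[(c // k**i) % k for i in range(n)] for c in range(C)])

# Random bipolar "role" hypervectors, one per (position, symbol) pair.
roles = rng.choice([-1, 1], size=(n, k, D))

def train(X, y):
    """Accumulate samples into n bundles; memory is O(D * n) = O(D log_k C)."""
    bundles = np.zeros((n, D))
    for x, c in zip(X, y):
        for i in range(n):
            # Tag the sample with the role of its code symbol at position i.
            bundles[i] += roles[i, codes[c, i]] * x
    return np.sign(bundles)

def predict(bundles, x):
    """Profile-based decoding over the bundle activations."""
    # Activation profile: similarity of x to each (position, symbol) slot,
    # recovered by unbinding the role from the corresponding bundle.
    act = np.array([[np.dot(bundles[i] * roles[i, s], x) for s in range(k)]
                    for i in range(n)])      # shape (n, k)
    # Score each class by how well its codeword matches the profile.
    scores = [act[np.arange(n), codes[c]].sum() for c in range(C)]
    return int(np.argmax(scores))
```

With $C=10$ and $k=2$, this model stores $4$ bundles instead of $10$ prototypes; the paper's capacity-aware codebook would additionally balance how many classes load onto each bundle, which the naive base-$k$ assignment above does not.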
Similar Papers
DecoHD: Decomposed Hyperdimensional Classification under Extreme Memory Budgets
Machine Learning (CS)
Makes smart computer brains much smaller and faster.
DPQ-HD: Post-Training Compression for Ultra-Low Power Hyperdimensional Computing
Machine Learning (CS)
Makes smart devices work faster with less power.
HD3C: Efficient Medical Data Classification for Embedded Devices
Machine Learning (CS)
Helps small devices diagnose sickness using less power.