An efficient probabilistic hardware architecture for diffusion-like models
By: Andraž Jelinčič, Owen Lockwood, Akhil Garlapati, and others
Potential Business Impact:
Makes probabilistic AI run on roughly 10,000 times less energy than GPUs.
The proliferation of probabilistic AI has prompted proposals for specialized stochastic computers. Despite promising efficiency gains, these proposals have failed to gain traction because they rely on fundamentally limited modeling techniques and exotic, unscalable hardware. In this work, we address these shortcomings by proposing an all-transistor probabilistic computer that implements powerful denoising models at the hardware level. A system-level analysis indicates that devices based on our architecture could reach performance parity with GPUs on a simple image benchmark while using approximately 10,000 times less energy.
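For readers unfamiliar with the "denoising models" the abstract refers to, the sketch below shows a generic discrete-time Gaussian diffusion (DDPM-style) forward-noising step and one reverse denoising step in NumPy. This is purely illustrative background on the model family, not the paper's hardware architecture; the noise schedule, function names, and the `eps_hat` placeholder are assumptions (a trained model would predict `eps_hat` with a learned network).

```python
import numpy as np

rng = np.random.default_rng(0)

# Forward process: progressively add Gaussian noise to data over T steps.
T = 100
betas = np.linspace(1e-4, 0.02, T)   # assumed linear noise schedule
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)      # cumulative signal retention

def noise(x0, t):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(abar_t) * x_0, (1 - abar_t) * I)."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps, eps

def denoise_step(x_t, t, eps_hat):
    """One reverse (denoising) step given a noise estimate eps_hat.

    In a trained diffusion model, eps_hat comes from a learned network;
    here it is just a placeholder argument.
    """
    coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
    mean = (x_t - coef * eps_hat) / np.sqrt(alphas[t])
    if t > 0:
        mean += np.sqrt(betas[t]) * rng.standard_normal(x_t.shape)
    return mean

# Toy usage: noise a sample, then take one denoising step using the true noise.
x0 = rng.standard_normal(8)
x_t, eps = noise(x0, t=50)
x_prev = denoise_step(x_t, t=50, eps_hat=eps)
```

The hardware claim in the abstract concerns implementing this kind of iterative stochastic sampling directly in transistor circuits rather than on a GPU, which is where the stated energy savings would come from.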
Similar Papers
Spintronic Bayesian Hardware Driven by Stochastic Magnetic Domain Wall Dynamics
Applied Physics
Runs Bayesian AI on stochastic magnetic hardware.
Resource-Efficient and Robust Inference of Deep and Bayesian Neural Networks on Embedded and Analog Computing Platforms
Machine Learning (CS)
Runs deep and Bayesian neural networks efficiently and reliably on embedded and analog hardware.
Low-rank surrogate modeling and stochastic zero-order optimization for training of neural networks with black-box layers
Machine Learning (CS)
Trains neural networks with black-box layers, such as optical hardware, without needing their gradients.