TinyDéjàVu: Smaller Memory Footprint & Faster Inference on Sensor Data Streams with Always-On Microcontrollers
By: Zhaolan Huang, Emmanuel Baccelli
Potential Business Impact:
Saves memory and power for tiny smart sensors.
Always-on sensors are increasingly expected to host a variety of tiny neural networks and to continuously perform inference on the time series of data they sense. To meet lifetime and energy-consumption requirements when operating on battery, such hardware uses microcontrollers (MCUs) with a tiny memory budget, e.g., 128 kB of RAM. In this context, optimizing data flows across neural network layers becomes crucial. In this paper, we introduce TinyDéjàVu, a new framework and novel algorithms we designed to drastically reduce the RAM footprint of inference with various tiny ML models on sensor-data time series, on typical microcontroller hardware. We publish the implementation of TinyDéjàVu as open source, and we perform reproducible benchmarks on hardware. We show that TinyDéjàVu can save more than 60% of RAM usage and eliminate up to 90% of redundant computation on overlapping sliding-window inputs.
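The compute saving comes from not redoing work shared between consecutive, overlapping sliding windows. The sketch below illustrates that general idea only; it is not the TinyDéjàVu algorithm, and all sizes, names, and the two stub model functions are hypothetical. With a window of W frames advancing by H frames per step, per-frame front-end outputs can be cached in a ring buffer so each step recomputes only the H new frames:

```c
/* Minimal sketch of sliding-window reuse; NOT the TinyDéjàVu implementation.
 * All constants and the two stub "model" functions are placeholders. */
#include <stddef.h>
#include <stdio.h>
#include <string.h>

#define WINDOW_FRAMES 16  /* frames per inference window (W) */
#define HOP_FRAMES     4  /* new frames per step (H); W-H frames overlap */
#define FRAME_LEN      8  /* raw samples per frame */
#define FEAT_DIM       8  /* per-frame feature vector size */

/* Cached per-frame features, kept as a ring buffer across steps. */
static float feat_ring[WINDOW_FRAMES][FEAT_DIM];
static size_t ring_head = 0; /* slot holding the oldest cached frame */

/* Stub standing in for the model's per-frame front-end layers. */
static void frame_features(float frame[FRAME_LEN], float out[FEAT_DIM])
{
    for (size_t d = 0; d < FEAT_DIM; ++d)
        out[d] = frame[d] * 0.5f; /* placeholder transform */
}

/* Stub standing in for the model's window-level head. */
static float window_head(float feats[WINDOW_FRAMES][FEAT_DIM])
{
    float acc = 0.0f;
    for (size_t i = 0; i < WINDOW_FRAMES; ++i)
        for (size_t d = 0; d < FEAT_DIM; ++d)
            acc += feats[i][d];
    return acc; /* placeholder score */
}

/* One sliding-window step: only the HOP_FRAMES new frames go through the
 * front-end; the WINDOW_FRAMES - HOP_FRAMES overlapping frames are reused. */
static float infer_step(float new_frames[HOP_FRAMES][FRAME_LEN])
{
    for (size_t i = 0; i < HOP_FRAMES; ++i) {
        frame_features(new_frames[i], feat_ring[ring_head]);
        ring_head = (ring_head + 1) % WINDOW_FRAMES;
    }
    /* Re-linearize the ring into temporal order for the head. */
    float ordered[WINDOW_FRAMES][FEAT_DIM];
    for (size_t i = 0; i < WINDOW_FRAMES; ++i)
        memcpy(ordered[i], feat_ring[(ring_head + i) % WINDOW_FRAMES],
               sizeof ordered[i]);
    return window_head(ordered);
}

int main(void)
{
    float frames[HOP_FRAMES][FRAME_LEN] = { { 1.0f } };
    /* Each step recomputes 4 of 16 frames: 75% of front-end work is reused. */
    printf("score = %f\n", infer_step(frames));
    printf("score = %f\n", infer_step(frames));
    return 0;
}
```

In this toy setup the fraction of front-end compute avoided per step is (W-H)/W; the paper's reported figure of up to 90% would correspond to heavily overlapping windows, and the actual savings depend on the model and layer structure.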
Similar Papers
Tin-Tin: Towards Tiny Learning on Tiny Devices with Integer-based Neural Network Training
Machine Learning (CS)
Lets tiny computers learn without internet.
Evaluating the Energy Efficiency of NPU-Accelerated Machine Learning Inference on Embedded Microcontrollers
Emerging Technologies
Makes tiny computers run smart programs faster, cheaper.
Energy-Efficient Deep Learning for Traffic Classification on Microcontrollers
Networking and Internet Architecture
Helps tiny computers understand internet traffic safely.