Kernel-Level Energy-Efficient Neural Architecture Search for Tabular Dataset
By: Hoang-Loc La, Phuong Hoai Ha
Potential Business Impact:
Finds computer designs that use far less power.
Many studies estimate energy consumption through proxy metrics such as memory usage, FLOPs, and inference latency, on the assumption that reducing these metrics will also lower a neural network's energy use. This paper takes a different approach, introducing an energy-efficient Neural Architecture Search (NAS) method that directly searches for architectures minimizing energy consumption while maintaining acceptable accuracy. Unlike previous methods, which primarily target vision and language tasks, the proposed approach specifically addresses tabular datasets. Remarkably, the optimal architecture found by this method can reduce energy consumption by up to 92% compared to architectures recommended by conventional NAS.
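To make the idea concrete, below is a minimal sketch of an energy-aware NAS loop for tabular MLPs that scores candidates by a measured energy value rather than proxy metrics, keeping only architectures that meet an accuracy floor. This is not the paper's implementation; the search space and helpers (`sample_architecture`, `measure_energy_joules`, `evaluate_accuracy`) are hypothetical placeholders, and the energy reading would come from a hardware counter (e.g., RAPL-style measurement) in practice.

```python
# Hypothetical sketch: random-search NAS that minimizes *measured* energy
# subject to an accuracy floor. All helper names are illustrative, not the
# paper's actual API.
import random

SEARCH_SPACE = {
    "num_layers": [1, 2, 3, 4],
    "hidden_units": [16, 32, 64, 128, 256],
    "activation": ["relu", "gelu"],
}

def sample_architecture(rng):
    """Draw one candidate architecture from the assumed search space."""
    return {key: rng.choice(values) for key, values in SEARCH_SPACE.items()}

def measure_energy_joules(arch):
    """Placeholder for a real hardware energy reading taken during inference.
    Faked here from architecture size so the sketch is self-contained."""
    return 1e-6 * arch["num_layers"] * arch["hidden_units"] ** 2

def evaluate_accuracy(arch, rng):
    """Placeholder for a short training run returning validation accuracy."""
    return 0.80 + 0.01 * rng.random()

def search(num_trials=50, accuracy_floor=0.80, seed=0):
    """Keep the lowest-energy architecture whose accuracy stays acceptable."""
    rng = random.Random(seed)
    best = None
    for _ in range(num_trials):
        arch = sample_architecture(rng)
        acc = evaluate_accuracy(arch, rng)
        if acc < accuracy_floor:
            continue  # discard candidates that trade away too much accuracy
        energy = measure_energy_joules(arch)
        if best is None or energy < best[0]:
            best = (energy, acc, arch)
    return best

if __name__ == "__main__":
    energy, acc, arch = search()
    print(f"best: {arch} | energy ~ {energy:.4f} J | accuracy ~ {acc:.3f}")
```

The key design choice illustrated here is that energy is an explicit search objective, measured per candidate, rather than being inferred from FLOPs or latency.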
Similar Papers
ZeroLM: Data-Free Transformer Architecture Search for Language Models
Computation and Language
Finds best computer brains faster and cheaper.
A Continuous Encoding-Based Representation for Efficient Multi-Fidelity Multi-Objective Neural Architecture Search
Machine Learning (CS)
Finds best computer designs faster for complex jobs.
Neural Architecture Search Algorithms for Quantum Autoencoders
Quantum Physics
Finds best quantum computer programs automatically.