Breaking the ICE: Exploring promises and challenges of benchmarks for Inference Carbon & Energy estimation for LLMs
By: Samarth Sikand, Rohit Mehra, Priyavanshi Pathania, and more
Potential Business Impact:
Estimates AI's energy use and pollution to help the planet.
While Generative AI stands to be one of the fastest-adopted technologies ever, studies have made it evident that the usage of Large Language Models (LLMs) places a significant burden on energy grids and our environment, and may hinder the sustainability goals of any organization. A crucial step in any sustainability strategy is monitoring or estimating the energy consumption of its various components. While multiple tools exist for monitoring energy consumption, there is a dearth of tools and frameworks for estimating consumption or carbon emissions. Current drawbacks of both monitoring and estimation tools include the large number of input data points required, their intrusive nature, high error margins, etc. We posit that leveraging emerging LLM benchmarks and related data points can help overcome the aforementioned challenges while preserving the accuracy of the emission estimates. To that end, we discuss the challenges of current approaches and present our evolving framework, R-ICE, which estimates prompt-level inference carbon emissions by leveraging existing state-of-the-art (SOTA) benchmarks. This direction provides a more practical and non-intrusive way to enable emerging use cases such as dynamic LLM routing, carbon accounting, etc. Our promising validation results suggest that benchmark-based modelling holds great potential for inference emission estimation and warrants further exploration by the scientific community.
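To illustrate the general idea of benchmark-based estimation (not the actual R-ICE formula, which the abstract does not specify), a minimal sketch might combine a benchmark-reported energy-per-token figure with a regional grid carbon intensity. All function names and numbers below are illustrative assumptions.

```python
# Hypothetical sketch of prompt-level emission estimation from
# benchmark data points. Not the R-ICE method; all values are
# illustrative assumptions, not measurements.

def estimate_emissions_g(
    output_tokens: int,
    energy_per_token_wh: float,       # assumed benchmark figure, Wh/token
    grid_intensity_g_per_kwh: float,  # regional grid carbon intensity, gCO2e/kWh
    pue: float = 1.2,                 # assumed data-centre power usage effectiveness
) -> float:
    """Return an estimate of gCO2e emitted while generating one prompt's output."""
    energy_kwh = output_tokens * energy_per_token_wh / 1000.0
    return energy_kwh * pue * grid_intensity_g_per_kwh

# Example: 500 output tokens at an assumed 0.003 Wh/token
# on an assumed 400 gCO2e/kWh grid.
print(round(estimate_emissions_g(500, 0.003, 400.0), 3))
```

Such an approach needs only published benchmark figures and grid data, which is what makes it non-intrusive: no power meters or per-request instrumentation on the serving hardware are required.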
Similar Papers
Quantifying the Energy Consumption and Carbon Emissions of LLM Inference via Simulations
Distributed, Parallel, and Cluster Computing
Makes AI use less electricity and pollution.
How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference
Computers and Society
Measures AI's energy use and pollution.
Optimizing Large Language Models: Metrics, Energy Efficiency, and Case Study Insights
Machine Learning (CS)
Cuts AI's energy use by almost half.