Quantifying the Energy Consumption and Carbon Emissions of LLM Inference via Simulations

Published: July 15, 2025 | arXiv ID: 2507.11417v1

By: Miray Özcan, Philipp Wiesner, Philipp Weiß, and more

Potential Business Impact:

Could reduce the electricity consumption and carbon emissions of AI deployments.

The environmental impact of Large Language Models (LLMs) is rising significantly, with inference now accounting for more than half of their total lifecycle carbon emissions. However, existing simulation frameworks, which are increasingly used to determine efficient LLM deployments, lack any concept of power and, therefore, cannot accurately estimate inference-related emissions. We present a simulation framework to assess the energy and carbon implications of LLM inference under varying deployment setups. First, we extend a high-fidelity LLM inference simulator with a GPU power model that estimates power consumption based on utilization metrics, enabling analysis across configurations like batch size, sequence length, and model parallelism. Second, we integrate simulation outputs into an energy system co-simulation environment to quantify carbon emissions under specific grid conditions and explore the potential of carbon-aware scheduling. Through scenario-based analysis, our framework reveals how inference parameters affect energy demand and carbon footprint, demonstrates a renewable offset potential of up to 69.2% in an illustrative deployment case, and provides a foundation for future carbon-aware inference infrastructure design.
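The core idea in the abstract, estimating power from GPU utilization and converting energy to emissions under a given grid carbon intensity, can be illustrated with a minimal sketch. This is not the authors' simulator; the linear power model and all constants (idle/peak wattage, grid intensity) are illustrative assumptions.

```python
def gpu_power_watts(utilization, p_idle=60.0, p_max=400.0):
    """Illustrative linear power model: interpolate between idle and peak draw."""
    u = max(0.0, min(1.0, utilization))
    return p_idle + u * (p_max - p_idle)

def inference_energy_kwh(utilization_trace, step_seconds=1.0):
    """Integrate power over a per-step GPU utilization trace (0..1 values)."""
    joules = sum(gpu_power_watts(u) * step_seconds for u in utilization_trace)
    return joules / 3.6e6  # joules -> kWh

def carbon_emissions_g(energy_kwh, grid_intensity_g_per_kwh):
    """Emissions for a given grid carbon intensity (gCO2 per kWh)."""
    return energy_kwh * grid_intensity_g_per_kwh

# Example: one hour of inference at 80% utilization, illustrative grid intensity.
trace = [0.8] * 3600
energy = inference_energy_kwh(trace)
print(round(energy, 3), "kWh ->", round(carbon_emissions_g(energy, 380.0)), "gCO2")
```

Carbon-aware scheduling, as explored in the paper, would then amount to shifting such a workload toward hours when the grid intensity passed to the last function is low.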

Country of Origin
🇩🇪 Germany

Repos / Data Links

Page Count
12 pages

Category
Computer Science:
Distributed, Parallel, and Cluster Computing