Holistically Evaluating the Environmental Impact of Creating Language Models
By: Jacob Morrison, Clara Na, Jared Fernandez, and others
Potential Business Impact:
Measures the carbon emissions and water used to build AI language models.
As the performance of artificial intelligence systems has dramatically increased, so too has the environmental impact of creating them. While many model developers release estimates of the power consumption and carbon emissions of the final training runs for their latest models, there is comparatively little transparency into the impact of model development, hardware manufacturing, and total water usage throughout the process. In this work, we estimate the real-world environmental impact of developing a series of language models, ranging from 20 million to 13 billion active parameters, trained on up to 5.6 trillion tokens each. When accounting for hardware manufacturing, model development, and our final training runs, we find that our series of models released 493 metric tons of carbon emissions, equivalent to powering about 98 homes in the United States for one year, and consumed 2.769 million liters of water, equivalent to about 24.5 years of water usage by a person in the United States, even though our data center is extremely water-efficient. We measure and report the environmental impact of our model development; to the best of our knowledge, we are the first to do so for LLMs, and we find that model development, whose impact is generally not disclosed, amounted to ~50% of that of training. By examining detailed time-series data on power consumption, we also find that power usage throughout training is not consistent, fluctuating between ~15% and ~85% of our hardware's maximum power draw, with negative implications for grid-scale planning as demand continues to grow. We close with a discussion of the continued difficulty of estimating the environmental impact of AI systems, and key takeaways for model developers and the public at large.
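The household and per-person equivalences stated above can be reproduced with back-of-the-envelope arithmetic. The sketch below uses rough external conversion factors (an EPA-style estimate of annual CO2 per US home's electricity use and a typical US per-capita daily water figure); these factors are assumptions for illustration, not numbers taken from the paper:

```python
# Sanity-check the reported equivalences from the paper's totals.
CO2_TONNES = 493            # reported total emissions, metric tons CO2e
WATER_LITERS = 2_769_000    # reported total water consumption, liters

# Assumed conversion factors (approximate, not from the paper):
TONNES_PER_US_HOME_YEAR = 5.0   # CO2 per home-year of US household electricity
LITERS_PER_PERSON_DAY = 310     # rough US per-capita daily water use

home_years = CO2_TONNES / TONNES_PER_US_HOME_YEAR
person_years = WATER_LITERS / (LITERS_PER_PERSON_DAY * 365)

print(f"~{home_years:.0f} home-years of electricity")
print(f"~{person_years:.1f} person-years of water use")
```

With these factors the totals work out to roughly 98 home-years of electricity and about 24.5 person-years of water, matching the abstract's equivalences.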
Similar Papers
The Carbon Cost of Conversation: Sustainability in the Age of Language Models
Computers and Society
AI uses lots of energy, harming the planet.
The Hidden AI Race: Tracking Environmental Costs of Innovation
Computers and Society
Finds ways to make AI less harmful to Earth.
How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference
Computers and Society
Measures AI's energy use and pollution.