Energy-Aware LLMs: A step towards sustainable AI for downstream applications

Published: March 22, 2025 | arXiv ID: 2503.17783v1

By: Nguyen Phuc Tran, Brigitte Jaumard, Oscar Delgado

Potential Business Impact:

Cuts the energy cost of running LLMs while maintaining, or even improving, model performance.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Advanced Large Language Models (LLMs) have revolutionized various fields, including communication networks, sparking an innovation wave that has led to new applications and services and significantly enhanced solution schemes. Despite these impressive developments, most LLMs require enormous computational resources, resulting in very high energy consumption. This study therefore proposes an end-to-end pipeline that investigates the trade-off between energy efficiency and model performance for an LLM applied to fault ticket analysis in communication networks. It evaluates the pipeline on two real-world datasets for the tasks of root cause analysis and response feedback in a communication network. The results show that an appropriate combination of quantization and pruning techniques can reduce energy consumption while significantly improving model performance.
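The paper does not specify its exact quantization and pruning configuration, but the general combination it studies can be sketched with standard PyTorch utilities. The snippet below is a minimal illustration, not the authors' pipeline: it uses a tiny stand-in model rather than an LLM, an assumed 30% pruning ratio, and dynamic int8 quantization of linear layers.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Tiny stand-in for an LLM; the real study targets a full language model.
model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 8))

# Magnitude (L1) pruning: zero out 30% of the smallest weights per Linear
# layer. The 30% ratio is an illustrative assumption, not the paper's value.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # bake the pruning mask into the weights

# Dynamic int8 quantization of the Linear layers reduces memory traffic and
# compute, which is the main lever for lowering energy consumption.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 64)
out = quantized(x)
print(out.shape)
```

In practice the interesting question, and the one the paper measures, is how far the pruning ratio and quantization bit-width can be pushed before task accuracy on root cause analysis and response feedback degrades.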

Country of Origin
🇨🇦 Canada

Page Count
6 pages

Category
Computer Science:
Performance