Energy-Aware Data-Driven Model Selection in LLM-Orchestrated AI Systems
By: Daria Smirnova, Hamid Nasiri, Marta Adamska, and more
Potential Business Impact:
Makes AI choose tools better, saving energy and time.
As modern artificial intelligence (AI) systems become more advanced and capable, they can leverage a wide range of tools and models to perform complex tasks. Today, the task of orchestrating these models is often performed by Large Language Models (LLMs) that rely on qualitative descriptions of the models when making decisions. However, the descriptions provided to these LLM-based orchestrators do not reflect true model capabilities and performance characteristics, leading to suboptimal model selection, reduced accuracy, and increased energy costs. In this paper, we conduct an empirical analysis of the limitations of LLM-based orchestration and propose GUIDE, a new energy-aware model selection framework that accounts for performance-energy trade-offs by incorporating quantitative model performance characteristics into decision-making. Experimental results demonstrate that GUIDE increases accuracy by 0.90%-11.92% across the evaluated tasks and improves energy efficiency by up to 54%, while reducing orchestrator model-selection latency from 4.51 s to 7.2 ms.
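To make the idea concrete, here is a minimal sketch of an energy-aware selection rule of the kind the abstract describes: candidate models carry measured (profiled) accuracy and per-query energy, and the orchestrator scores each on an accuracy-energy trade-off instead of a qualitative description. All names, weights, and profile numbers below are illustrative assumptions, not GUIDE's actual method.

```python
# Hypothetical sketch: score each candidate model by profiled accuracy
# minus a weighted energy cost, then pick the highest-scoring one.
from dataclasses import dataclass

@dataclass
class ModelProfile:
    name: str
    accuracy: float   # task accuracy measured on a profiling set (0..1)
    energy_j: float   # measured energy per query, in joules

def select_model(profiles, energy_weight=0.01):
    """Return the model maximizing accuracy - energy_weight * energy."""
    return max(profiles, key=lambda p: p.accuracy - energy_weight * p.energy_j)

# Illustrative profiles (invented numbers):
profiles = [
    ModelProfile("small-llm", accuracy=0.81, energy_j=5.0),
    ModelProfile("large-llm", accuracy=0.86, energy_j=120.0),
    ModelProfile("vision-tool", accuracy=0.84, energy_j=20.0),
]
best = select_model(profiles)
print(best.name)  # with these numbers: "small-llm"
```

Because the scores come from a lookup over pre-measured profiles rather than an LLM call, a rule like this runs in microseconds, which is consistent with the latency drop (4.51 s to 7.2 ms) the abstract reports.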
Similar Papers
Energy-Aware LLMs: A step towards sustainable AI for downstream applications
Performance
Saves energy while making AI smarter.
Context-aware LLM-based AI Agents for Human-centered Energy Management Systems in Smart Buildings
Artificial Intelligence
AI helps buildings save energy by talking to them.
Can AI Make Energy Retrofit Decisions? An Evaluation of Large Language Models
Artificial Intelligence
Helps homes use less energy and save money.