Energy-Efficient Dynamic Training and Inference for GNN-Based Network Modeling

Published: March 24, 2025 | arXiv ID: 2503.18706v1

By: Chetna Singhal, Yassine Hadjadj-Aoul

Potential Business Impact:

Saves energy through smarter modeling of computer networks.

Business Areas:
Power Grid Energy

Efficient network modeling is essential for resource optimization and network planning in next-generation large-scale complex networks. Traditional approaches, such as queuing-theory-based modeling and packet-based simulators, can be inefficient due to the assumptions they make and their computational expense, respectively. To address these challenges, we propose an innovative energy-efficient dynamic orchestration framework for Graph Neural Network (GNN)-based model training and inference for context-aware network modeling and predictions. We have developed a low-complexity solution framework, QAG, which is a Quantum approximation optimization (QAO) algorithm for Adaptive orchestration of GNN-based network modeling. We leverage a tripartite graph model to represent a multi-application system with many compute nodes. Thereafter, we apply constrained graph-cutting using QAO to find feasible energy-efficient configurations of the GNN-based model and deploy them on the available compute nodes to meet the network modeling application requirements. The proposed QAG scheme closely matches the optimum and offers at least a 50% energy saving while meeting the application requirements with a 60% lower churn rate.
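To make the orchestration idea concrete, here is a minimal sketch of the selection problem the abstract describes: applications, candidate GNN model configurations, and compute nodes form a tripartite structure, and a feasible minimum-energy assignment is chosen subject to accuracy requirements and per-node energy budgets. This is not the authors' code; the application names, accuracy/energy numbers, and budgets are hypothetical, and a brute-force search stands in for the paper's QAO-based constrained graph-cutting.

```python
"""Illustrative sketch only: tripartite assignment of applications to
(GNN configuration, compute node) pairs, minimizing total energy.
All names and numbers below are assumptions for illustration."""
from itertools import product

# Applications and their (assumed) minimum prediction-accuracy requirements.
apps = {"delay_pred": 0.90, "jitter_pred": 0.85}

# Candidate GNN configurations: (assumed) accuracy and per-run energy cost.
gnn_configs = {
    "gnn_small": {"accuracy": 0.86, "energy": 1.0},
    "gnn_medium": {"accuracy": 0.91, "energy": 2.5},
    "gnn_large": {"accuracy": 0.95, "energy": 5.0},
}

# Compute nodes and their (assumed) energy budgets.
nodes = {"edge_1": 4.0, "edge_2": 6.0}


def feasible(assignment):
    """Check accuracy requirements and per-node energy budgets."""
    load = {n: 0.0 for n in nodes}
    for (_app, required_acc), (cfg, node) in zip(apps.items(), assignment):
        if gnn_configs[cfg]["accuracy"] < required_acc:
            return False
        load[node] += gnn_configs[cfg]["energy"]
    return all(load[n] <= nodes[n] for n in nodes)


def total_energy(assignment):
    return sum(gnn_configs[cfg]["energy"] for cfg, _node in assignment)


# Enumerate every (configuration, node) choice per application. In the paper,
# this combinatorial search is instead handled by constrained graph-cutting
# with QAO over the tripartite graph.
choices = list(product(gnn_configs, nodes))
best = min(
    (a for a in product(choices, repeat=len(apps)) if feasible(a)),
    key=total_energy,
)
print("selected deployment:", dict(zip(apps, best)), "energy:", total_energy(best))
```

With these toy numbers the search picks the smallest configurations that still meet each application's accuracy requirement and fit the node budgets; the point of the paper's QAO formulation is to reach such a configuration without enumerating all combinations, which becomes intractable as applications and compute nodes scale.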

Page Count
7 pages

Category
Computer Science:
Networking and Internet Architecture