PowerTrip: Exploiting Federated Heterogeneous Datacenter Power for Distributed ML Training
By: Talha Mehboob, Luanzheng Guo, Nathan Tallent, and more
Potential Business Impact:
Trains large AI models faster using less power.
The exponential growth of large-scale AI models has driven computational and power demands beyond the capacity of a single data center, because the limited power supplied by a regional grid caps the compute that any one region can host. Consequently, distributing training workloads across geographically distributed sites has become essential. However, this approach introduces significant communication overhead, creating a fundamental trade-off between the performance gains from accessing greater aggregate power and the performance losses from increased network latency. Prior work has focused on reducing communication volume or on heuristics for workload distribution, but these methods assume constant, homogeneous power supplies and ignore the challenge of heterogeneous power availability across sites. To address the challenge of training large models in power-constrained, geo-distributed environments, we introduce PowerTrip, a system that dynamically selects a subset of sites at runtime to optimize the power-communication trade-off. Specifically, PowerTrip ranks sites by a power-to-cost heuristic, prioritizing those with high power availability and low network latency. It then grows the selection with a dynamic greedy approach, using the marginal gain in training efficiency, i.e., accuracy improvement per unit of time, to stop at the point where the performance penalty from network overhead negates the benefit of adding more computational power. Our evaluation, which uses real-world Google power traces to model realistic power capacity constraints, demonstrates that PowerTrip can reduce time-to-accuracy by up to 50% compared to existing baseline policies.
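To make the selection loop concrete, here is a minimal Python sketch of the kind of greedy power-to-cost selection the abstract describes. All names (Site, power_to_cost, marginal_gain, select_sites), the toy efficiency proxy, and the 0.05 penalty constant are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of greedy power-to-cost site selection in the
# spirit of PowerTrip. The efficiency model below is a toy proxy; a
# real system would estimate marginal gain from measured throughput
# and communication-overhead models.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    power_mw: float      # available power at the site (MW)
    latency_ms: float    # network latency to the rest of the ensemble (ms)

def power_to_cost(site: Site) -> float:
    # Prioritize sites with high power availability and low latency.
    return site.power_mw / site.latency_ms

def marginal_gain(selected: list[Site], candidate: Site) -> float:
    # Toy stand-in for the marginal gain in training efficiency
    # (accuracy improvement per unit time): added compute power minus
    # a latency penalty that grows with ensemble size.
    comm_penalty = candidate.latency_ms * (len(selected) + 1) * 0.05
    return candidate.power_mw - comm_penalty

def select_sites(sites: list[Site]) -> list[Site]:
    # Greedily add sites in power-to-cost order, stopping once the
    # network-overhead penalty negates the benefit of more compute.
    selected: list[Site] = []
    for site in sorted(sites, key=power_to_cost, reverse=True):
        if marginal_gain(selected, site) <= 0:
            break
        selected.append(site)
    return selected

if __name__ == "__main__":
    sites = [
        Site("east", power_mw=12.0, latency_ms=20.0),
        Site("west", power_mw=8.0, latency_ms=60.0),
        Site("eu", power_mw=10.0, latency_ms=120.0),
    ]
    print([s.name for s in select_sites(sites)])  # -> ['east', 'west']
```

In this toy run, the high-latency "eu" site is rejected because its added power no longer offsets the growing communication penalty, which is the stopping behavior the abstract attributes to the marginal-gain criterion.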
Similar Papers
Green Distributed AI Training: Orchestrating Compute Across Renewable-Powered Micro Datacenters
Networking and Internet Architecture
Moves computer work to clean energy when available.
Power Stabilization for AI Training Datacenters
Hardware Architecture
Smooths out big computer power spikes.