Revisiting Training Scale: An Empirical Study of Token Count, Power Consumption, and Parameter Efficiency

Published: January 10, 2026 | arXiv ID: 2601.06649v1

By: Joe Dwyer

Potential Business Impact:

Could make training AI models consume less power and take less time.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Research in machine learning has questioned whether increases in training token counts reliably produce proportional performance gains in large language models. Building on prior work that introduced an energy-aware parameter efficiency metric, this study empirically examines the effects of increasing training token counts under fixed hardware and training conditions. The significance of this work lies in its explicit integration of power consumption and execution duration, captured through fixed-frequency power sampling, into token-scale analysis; this addresses a gap in prior studies that emphasize performance outcomes while underrepresenting computational and energy costs. Using a repeated-measures experimental design on a constant GPU instance with an identical model architecture, optimizer settings, and epoch counts, a 1.1-billion-parameter TinyLlama model was trained at three token counts (500K, 1M, and 2M). While conventional performance metrics exhibited inconsistent or diminishing returns across token scales, incorporating power consumption and execution duration revealed a strictly monotonic decline in training efficiency as token count increased. Repeated-measures ANOVA demonstrated a strong effect of token count on parameter efficiency, with all pairwise comparisons remaining significant after Bonferroni correction. These findings indicate that increasing training token counts may be energetically inefficient even when marginal performance improvements are observed, underscoring the importance of efficiency-aware evaluation in large language model training.
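Since the abstract describes the efficiency metric and the post-hoc analysis only at a high level, the minimal Python sketch below may help make them concrete. The efficiency formula (score per parameter per joule), the power sampling interval, the toy numbers, and the use of paired t-tests for the pairwise comparisons are all assumptions for illustration, not the paper's actual implementation.

```python
# Hypothetical sketch of an energy-aware parameter-efficiency metric and
# Bonferroni-corrected pairwise comparisons, loosely following the abstract.
# All formulas and numbers here are illustrative assumptions.

from itertools import combinations

import numpy as np
from scipy import stats


def energy_joules(power_samples_w: np.ndarray, sample_interval_s: float) -> float:
    """Approximate energy from periodic GPU power samples (trapezoidal rule)."""
    return float(np.trapz(power_samples_w, dx=sample_interval_s))


def parameter_efficiency(score: float, n_params: float, energy_j: float) -> float:
    """Assumed metric: task score normalized by parameter count and energy spent."""
    return score / (n_params * energy_j)


# Example: energy and efficiency for one hypothetical 1M-token run.
power_w = np.array([285.0, 300.0, 310.0, 295.0])  # watts, sampled every 5 s
e_j = energy_joules(power_w, sample_interval_s=5.0)
eff = parameter_efficiency(score=0.62, n_params=1.1e9, energy_j=e_j)
print(f"energy = {e_j:.0f} J, efficiency = {eff:.3e}")

# Toy repeated-measures data: efficiency from seed-matched runs at each
# token count. Values are illustrative only, not the paper's results.
rng = np.random.default_rng(0)
token_counts = ["500K", "1M", "2M"]
efficiency = {
    "500K": rng.normal(1.00, 0.02, size=8),
    "1M": rng.normal(0.70, 0.02, size=8),
    "2M": rng.normal(0.45, 0.02, size=8),
}

# Pairwise comparisons: paired t-tests (runs are matched across conditions),
# with Bonferroni correction applied by multiplying each raw p-value by the
# number of comparisons, capped at 1.
pairs = list(combinations(token_counts, 2))
for a, b in pairs:
    t, p = stats.ttest_rel(efficiency[a], efficiency[b])
    p_adj = min(p * len(pairs), 1.0)
    print(f"{a} vs {b}: t = {t:.2f}, Bonferroni-adjusted p = {p_adj:.4g}")
```

Bonferroni correction as sketched here simply scales each raw p-value by the number of pairwise comparisons, which is conservative but consistent with the abstract's description of all pairwise comparisons remaining significant after correction.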

Country of Origin
🇺🇸 United States

Page Count
6 pages

Category
Computer Science:
Machine Learning (CS)