Neural Thermodynamic Laws for Large Language Model Training
By: Ziming Liu, Yizhou Liu, Jeff Gore, and others
Potential Business Impact:
Offers principled guidance for tuning learning rates, potentially making LLM training faster and more efficient.
Beyond neural scaling laws, little is known about the laws underlying large language models (LLMs). We introduce Neural Thermodynamic Laws (NTL) -- a new framework that offers fresh insights into LLM training dynamics. On the theoretical side, we demonstrate that key thermodynamic quantities (e.g., temperature, entropy, heat capacity, thermal conduction) and classical thermodynamic principles (e.g., the three laws of thermodynamics and the equipartition theorem) naturally emerge under river-valley loss landscape assumptions. On the practical side, this scientific perspective yields intuitive guidelines for designing learning rate schedules.
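To make the learning-rate guidance concrete, the sketch below shows a generic warmup-stable-decay (WSD) schedule, a common family in LLM training where the final decay phase plays the role of annealing (gradually lowering "temperature"). This is an illustrative assumption, not the paper's prescribed schedule: the function name wsd_lr, the peak learning rate, the phase fractions, and the square-root decay shape are all placeholder choices.

```python
# A minimal, self-contained sketch of a warmup-stable-decay (WSD) learning-rate
# schedule. All hyperparameters below (peak_lr, warmup/decay fractions, the
# 1 - sqrt(progress) decay shape) are illustrative assumptions, not values
# taken from the paper.

def wsd_lr(step: int, total_steps: int, peak_lr: float = 3e-4,
           warmup_frac: float = 0.05, decay_frac: float = 0.2) -> float:
    """Piecewise schedule: linear warmup, constant plateau, then decay toward zero."""
    warmup_steps = int(total_steps * warmup_frac)
    decay_steps = int(total_steps * decay_frac)
    decay_start = total_steps - decay_steps

    if step < warmup_steps:            # linear warmup ("heating up")
        return peak_lr * step / max(warmup_steps, 1)
    if step < decay_start:             # stable phase at the peak learning rate
        return peak_lr
    # decay phase: anneal toward zero ("cooling"); sqrt progress is one choice
    progress = (step - decay_start) / max(decay_steps, 1)
    return peak_lr * (1.0 - progress ** 0.5)


if __name__ == "__main__":
    total = 10_000
    for s in (0, 250, 500, 5_000, 8_000, 9_000, 10_000):
        print(f"step {s:>6}: lr = {wsd_lr(s, total):.6f}")
```

In the thermodynamic reading, the plateau keeps the "temperature" high enough to explore along the river-valley floor, while the decay phase cools the system so the iterate settles into low-loss regions; the exact decay shape is a design choice the framework is meant to inform.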
Similar Papers
Large Language Model Scaling Laws for Neural Quantum States in Quantum Chemistry
Machine Learning (CS)
Shows how bigger AI models get better at quantum chemistry.
Phase Transitions in Large Language Models and the $O(N)$ Model
Machine Learning (CS)
Finds new abilities in AI as it grows.
Neural Thermodynamics I: Entropic Forces in Deep and Universal Representation Learning
Machine Learning (CS)
Explains how AI learns by using "entropic forces."