Score: 1

Neural Thermodynamic Laws for Large Language Model Training

Published: May 15, 2025 | arXiv ID: 2505.10559v1

By: Ziming Liu, Yizhou Liu, Jeff Gore, and more

BigTech Affiliations: Massachusetts Institute of Technology

Potential Business Impact:

Offers principled learning-rate-schedule guidelines that could make training large language models faster and more efficient.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Beyond neural scaling laws, little is known about the laws underlying large language models (LLMs). We introduce Neural Thermodynamic Laws (NTL) -- a new framework that offers fresh insights into LLM training dynamics. On the theoretical side, we demonstrate that key thermodynamic quantities (e.g., temperature, entropy, heat capacity, thermal conduction) and classical thermodynamic principles (e.g., the three laws of thermodynamics and the equipartition theorem) naturally emerge under river-valley loss landscape assumptions. On the practical side, this scientific perspective yields intuitive guidelines for designing learning rate schedules.
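To make the equipartition claim concrete, here is a minimal sketch of how an equipartition-style result can emerge from noisy gradient descent; the notation (curvature a, noise variance \sigma^2) is my own and this is a standard textbook-style calculation, not necessarily the paper's derivation:

```latex
% SGD on one quadratic mode of the loss, L(\theta) = \tfrac{1}{2} a \theta^2,
% with additive gradient noise \xi_t, \mathrm{Var}[\xi_t] = \sigma^2:
\theta_{t+1} = \theta_t - \eta\,(a\,\theta_t + \xi_t)
% The stationary variance v solves v = (1 - \eta a)^2 v + \eta^2 \sigma^2, so
\langle \theta^2 \rangle = \frac{\eta \sigma^2}{a\,(2 - \eta a)}
  \;\approx\; \frac{\eta \sigma^2}{2a}, \qquad \eta a \ll 1,
% and the mean energy per mode is curvature-independent:
\langle L \rangle = \tfrac{1}{2}\, a \,\langle \theta^2 \rangle
  \;\approx\; \tfrac{1}{4}\, \eta \sigma^2,
% an equipartition-style statement with effective temperature T \propto \eta \sigma^2.
```

On the practical side, the "learning rate as temperature" reading suggests keeping the rate high while moving along the valley and annealing it at the end. The sketch below shows a warmup-stable-decay (WSD) schedule, a common LLM schedule consistent with that intuition; the function name, parameters, and linear decay shape are illustrative assumptions, not the paper's prescription:

```python
def wsd_lr(step: int, total_steps: int, peak_lr: float = 3e-4,
           warmup_frac: float = 0.01, decay_frac: float = 0.2,
           min_lr: float = 0.0) -> float:
    """Warmup-stable-decay learning-rate schedule (illustrative sketch).

    Thermodynamic reading: the constant phase keeps the "temperature"
    high enough to travel along the river valley; the final decay
    anneals it so the model settles toward the valley floor.
    """
    warmup_steps = max(1, int(warmup_frac * total_steps))
    decay_steps = max(1, int(decay_frac * total_steps))
    decay_start = total_steps - decay_steps

    if step < warmup_steps:          # linear warmup
        return peak_lr * step / warmup_steps
    if step < decay_start:           # stable (constant) phase
        return peak_lr
    # Decay phase: linear anneal to min_lr (the decay shape is an
    # assumption here; the paper may motivate a specific form).
    progress = (step - decay_start) / decay_steps
    return peak_lr + (min_lr - peak_lr) * progress


if __name__ == "__main__":
    total = 10_000
    for s in (0, 50, 100, 5_000, 8_000, 9_000, 9_999):
        print(f"step {s:>5}: lr = {wsd_lr(s, total):.6f}")
```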

Country of Origin
🇺🇸 United States

Page Count
18 pages

Category
Computer Science:
Machine Learning (CS)