Entropy Production in Machine Learning Under Fokker-Planck Probability Flow

Published: January 2, 2026 | arXiv ID: 2601.00554v1

By: Lennon Shikhman

Potential Business Impact:

Keeps deployed machine learning models accurate as data drifts, while cutting retraining frequency, and hence operational cost, by an order of magnitude.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Machine learning models deployed in nonstationary environments experience performance degradation due to data drift. While many drift detection heuristics exist, most lack a principled dynamical interpretation and provide limited guidance on how retraining frequency should be balanced against operational cost. In this work, we propose an entropy-based retraining framework grounded in nonequilibrium stochastic dynamics. Modeling deployment-time data drift as probability flow governed by a Fokker-Planck equation, we quantify model-data mismatch using a time-evolving Kullback-Leibler divergence. We show that the time derivative of this mismatch admits an entropy-balance decomposition featuring a nonnegative entropy production term driven by probability currents. This interpretation motivates entropy-triggered retraining as a label-free intervention strategy that responds to accumulated mismatch rather than delayed performance collapse. In a controlled nonstationary classification experiment, entropy-triggered retraining achieves predictive performance comparable to high-frequency retraining while reducing retraining events by an order of magnitude relative to daily and label-based policies.
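
For readers who want the abstract's objects pinned down, the following is a minimal sketch in standard Fokker-Planck notation, assuming a constant scalar diffusion coefficient D and a fixed training distribution q; the paper's exact notation, assumptions, and decomposition may differ.

```latex
% Drift as probability flow: the deployment-time input density p_t
% evolves under a Fokker-Planck equation with probability current J_t.
\[
  \partial_t p_t(x) = -\nabla \cdot J_t(x),
  \qquad
  J_t(x) = \mu(x)\,p_t(x) - D\,\nabla p_t(x).
\]
% Model-data mismatch as a time-evolving KL divergence to the fixed
% training distribution q.
\[
  D_{\mathrm{KL}}(p_t \,\|\, q) = \int p_t(x)\,\log\frac{p_t(x)}{q(x)}\,dx.
\]
% Differentiating under the integral and integrating by parts yields an
% entropy-balance form: a nonnegative production rate sigma_t driven by
% the current, plus a flow (exchange) term Phi_t.
\[
  \frac{d}{dt} D_{\mathrm{KL}}(p_t \,\|\, q) = -\sigma_t + \Phi_t,
  \qquad
  \sigma_t = \int \frac{\lVert J_t(x)\rVert^2}{D\,p_t(x)}\,dx \;\ge\; 0,
  \qquad
  \Phi_t = \int J_t(x)\cdot\Bigl(\tfrac{\mu(x)}{D} - \nabla\log q(x)\Bigr)\,dx.
\]
```

When q is stationary for the dynamics (mu = D grad log q), the flow term vanishes and the mismatch relaxes monotonically; persistent drift keeps the flow term active and lets the mismatch grow, which is the signal an entropy-triggered policy monitors.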
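As a concrete illustration of the retraining policy, here is a minimal executable sketch that makes no claim about the paper's implementation: the drift statistic is a crude histogram-based KL estimate between a reference window and each incoming batch, and the threshold value is purely illustrative.

```python
# Illustrative sketch of label-free, entropy-triggered retraining
# (not the paper's implementation).
import numpy as np

def kl_histogram(reference, batch, bins=30, eps=1e-9):
    """Crude estimate of KL(batch || reference) from shared 1-D histograms."""
    lo = min(reference.min(), batch.min())
    hi = max(reference.max(), batch.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(batch, bins=edges)
    q, _ = np.histogram(reference, bins=edges)
    p = p / p.sum() + eps  # normalize; eps avoids log(0)
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))

def entropy_triggered_stream(batches, reference, threshold=0.5):
    """Retrain only when accumulated mismatch crosses `threshold`
    (a hypothetical value chosen for illustration)."""
    accumulated = 0.0
    retrain_events = []
    for t, batch in enumerate(batches):
        accumulated += kl_histogram(reference, batch)
        if accumulated >= threshold:
            retrain_events.append(t)  # stand-in for model.fit(...)
            reference = batch         # reset reference after retraining
            accumulated = 0.0
    return retrain_events

# Toy nonstationary stream: the input mean drifts slowly over time.
rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=2000)
batches = [rng.normal(0.02 * t, 1.0, size=500) for t in range(100)]
print("retraining steps:", entropy_triggered_stream(batches, reference))
```

Accumulating the per-batch divergence before comparing it to a threshold mirrors the abstract's "accumulated mismatch" trigger; a daily or label-based policy would instead fire on a clock or on observed accuracy drops.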

Country of Origin
🇺🇸 United States

Page Count
9 pages

Category
Computer Science:
Machine Learning (CS)