Flow Matching at Scale: A Machine Learning Framework for Efficient Large-Size Sampling of Many-Body Systems
By: Qian-Rui Lee, Daw-Wei Wang
Potential Business Impact:
A model trained only on small examples learns to generate much larger systems.
We propose a machine learning framework based on Flow Matching to overcome the scaling limitations of Markov Chain Monte Carlo (MCMC) methods. We demonstrate its capability on the 2D XY model, where a single network, trained only on configurations from a small ($32\times 32$) lattice at sparse temperature points, generates reliable samples for a significantly larger system ($128\times 128$) across a continuous temperature range without retraining. The generated configurations show strong agreement with key thermodynamic observables and correctly capture the signatures of the Berezinskii-Kosterlitz-Thouless (BKT) transition. This dual generalization, in both temperature and system size, is enabled by the Flow Matching framework, which lets us learn a continuous, temperature-conditioned mapping, while the inductive biases of the underlying CNN architecture ensure that the learned local physical rules are scale-invariant. This "train-small, generate-large" capability establishes a new paradigm for efficiently studying critical phenomena, offering a significant computational advantage for exploring the thermodynamic limit. The method can be directly applied to other classical or quantum many-body systems described by continuous fields on a lattice.
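To make the recipe concrete, here is a minimal sketch of how such a pipeline could be set up, assuming a PyTorch implementation, a single-channel field per lattice site, a linear-interpolation Flow Matching objective, and an Euler ODE sampler. The names `VelocityCNN`, `fm_loss`, and `sample`, the circular padding, and all hyperparameters are illustrative choices, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): temperature-conditioned Flow Matching
# with a fully convolutional velocity field for continuous lattice configurations.
import torch
import torch.nn as nn

class VelocityCNN(nn.Module):
    """Fully convolutional net: no pooling or dense layers, so the same weights
    apply to any lattice size (e.g. train on 32x32, sample at 128x128)."""
    def __init__(self, hidden=64):
        super().__init__()
        # Input channels: field value (e.g. spin angle), flow time t, temperature T.
        self.net = nn.Sequential(
            nn.Conv2d(3, hidden, 3, padding=1, padding_mode="circular"),
            nn.SiLU(),
            nn.Conv2d(hidden, hidden, 3, padding=1, padding_mode="circular"),
            nn.SiLU(),
            nn.Conv2d(hidden, 1, 3, padding=1, padding_mode="circular"),
        )

    def forward(self, x, t, temp):
        # Broadcast the scalar conditions (flow time, temperature) to constant maps.
        b, _, h, w = x.shape
        t_map = t.view(b, 1, 1, 1).expand(b, 1, h, w)
        temp_map = temp.view(b, 1, 1, 1).expand(b, 1, h, w)
        return self.net(torch.cat([x, t_map, temp_map], dim=1))

def fm_loss(model, x1, temp):
    """Flow Matching loss with a linear path: regress the straight-line velocity
    x1 - x0 along the interpolation x_t = (1 - t) * x0 + t * x1."""
    x0 = torch.randn_like(x1)                      # noise endpoint
    t = torch.rand(x1.shape[0], device=x1.device)  # flow time ~ U(0, 1)
    xt = (1 - t.view(-1, 1, 1, 1)) * x0 + t.view(-1, 1, 1, 1) * x1
    target = x1 - x0
    return ((model(xt, t, temp) - target) ** 2).mean()

@torch.no_grad()
def sample(model, temp, size=128, batch=16, steps=100):
    """Euler integration of dx/dt = v_theta(x, t, T) from noise to data,
    on a lattice larger than the one used for training."""
    x = torch.randn(batch, 1, size, size)
    temp = torch.full((batch,), temp)
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.full((batch,), i * dt)
        x = x + dt * model(x, t, temp)
    return x
```

In this sketch, size generalization comes from keeping the velocity network purely convolutional (here with circular padding for periodic boundaries), so the same weights can be evaluated on a $128\times 128$ lattice even though training only ever saw $32\times 32$ configurations, while temperature generalization comes from feeding the temperature as a conditioning channel, so sampling anywhere in the trained temperature range requires no retraining.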
Similar Papers
On the flow matching interpretability
Machine Learning (CS)
Makes AI understand how things change physically.
Flow Matching for Probabilistic Learning of Dynamical Systems from Missing or Noisy Data
Machine Learning (CS)
Predicts many possible futures for weather.