Flow Matching at Scale: A Machine Learning Framework for Efficient Large-Size Sampling of Many-Body Systems
By: Qian-Rui Lee, Daw-Wei Wang
Potential Business Impact:
Teaches computers to build big things from small examples.
We propose a machine learning framework based on Flow Matching to overcome the scaling limitations of Markov Chain Monte Carlo (MCMC) methods. We demonstrate its capability in the 2D XY model, where a single network, trained only on configurations from a small ($32\times 32$) lattice at sparse temperature points, generates reliable samples for a significantly larger system ($128\times 128$) across a continuous temperature range without retraining. The generated configurations show strong agreement with key thermodynamic observables and correctly capture the signatures of the Berezinskii-Kosterlitz-Thouless (BKT) transition. This dual generalization is enabled by the Flow Matching framework, which allows us to learn a continuous, temperature-conditioned mapping. At the same time, the inductive biases of the underlying CNN architecture ensure that the learned local physical rules are scale-invariant. This "train-small, generate-large" capability offers a powerful and efficient alternative for studying critical phenomena. The method can be directly applied to other classical or quantum many-body systems described by continuous fields on a lattice. Furthermore, this framework can serve as a powerful proposal generator in a hybrid scheme with MCMC, dramatically accelerating high-precision studies of the thermodynamic limit.
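To make the "train-small, generate-large" idea concrete, the sketch below shows a minimal conditional Flow Matching setup for lattice spin configurations, assuming a PyTorch implementation. The network name FlowNet, the (cos θ, sin θ) channel encoding, the Euler sampler, and all hyperparameters are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumed PyTorch API): conditional Flow Matching for
# XY-model configurations, with temperature as an extra conditioning channel.
import torch
import torch.nn as nn


class FlowNet(nn.Module):
    """Temperature-conditioned velocity field v_theta(x_t, t, T).

    A purely convolutional network with periodic padding has no fixed input
    size, so weights trained on a 32x32 lattice can be evaluated on a
    128x128 lattice at sampling time (illustrative architecture).
    """

    def __init__(self, channels=64):
        super().__init__()
        # 2 field channels (cos theta, sin theta) + 2 scalar maps (t, T)
        self.net = nn.Sequential(
            nn.Conv2d(4, channels, 3, padding=1, padding_mode="circular"),
            nn.GELU(),
            nn.Conv2d(channels, channels, 3, padding=1, padding_mode="circular"),
            nn.GELU(),
            nn.Conv2d(channels, 2, 3, padding=1, padding_mode="circular"),
        )

    def forward(self, x_t, t, temp):
        b, _, h, w = x_t.shape
        # broadcast flow time t and temperature T as constant spatial channels
        t_map = t.view(b, 1, 1, 1).expand(b, 1, h, w)
        temp_map = temp.view(b, 1, 1, 1).expand(b, 1, h, w)
        return self.net(torch.cat([x_t, t_map, temp_map], dim=1))


def flow_matching_step(model, optimizer, x1, temp):
    """One conditional flow-matching step on a batch of configurations.

    x1:   data encoded as (cos theta, sin theta), shape (B, 2, L, L)
    temp: temperature at which each configuration was sampled, shape (B,)
    """
    x0 = torch.randn_like(x1)                      # sample from the Gaussian base
    t = torch.rand(x1.size(0), device=x1.device)   # flow time in [0, 1]
    tt = t.view(-1, 1, 1, 1)
    xt = (1.0 - tt) * x0 + tt * x1                 # linear interpolation path
    target = x1 - x0                               # velocity of that path
    loss = ((model(xt, t, temp) - target) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


@torch.no_grad()
def generate(model, temp, lattice_size=128, steps=100):
    """Euler integration of dx/dt = v_theta from noise to a configuration,
    on a lattice that may be larger than the training lattice."""
    x = torch.randn(temp.size(0), 2, lattice_size, lattice_size, device=temp.device)
    for k in range(steps):
        t = torch.full((temp.size(0),), k / steps, device=temp.device)
        x = x + model(x, t, temp) / steps
    return torch.atan2(x[:, 1], x[:, 0])           # recover spin angles theta(r)
```

Because the velocity network is fully convolutional with periodic padding, the same weights trained on small-lattice data can be applied to a larger lattice at generation time, which is the mechanism behind the abstract's "train-small, generate-large" claim; the specific layer choices here are placeholders.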
Similar Papers
On the flow matching interpretability
Machine Learning (CS)
Makes AI understand how things change physically.
Flow Matching for Probabilistic Learning of Dynamical Systems from Missing or Noisy Data
Machine Learning (CS)
Predicts many possible futures for weather.