Boosted GFlowNets: Improving Exploration via Sequential Learning

Published: November 12, 2025 | arXiv ID: 2511.09677v1

By: Pedro Dall'Antonia, Tiago da Silva, Daniel Augusto de Souza, and more

Potential Business Impact:

Finds rare, high-reward solutions by exploring the search space more evenly.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Generative Flow Networks (GFlowNets) are powerful samplers for compositional objects that, by design, sample proportionally to a given non-negative reward. Nonetheless, in practice, they often struggle to explore the reward landscape evenly: trajectories toward easy-to-reach regions dominate training, while hard-to-reach modes receive vanishing or uninformative gradients, leading to poor coverage of high-reward areas. We address this imbalance with Boosted GFlowNets, a method that sequentially trains an ensemble of GFlowNets, each optimizing a residual reward that compensates for the mass already captured by previous models. This residual principle reactivates learning signals in underexplored regions and, under mild assumptions, ensures a monotone non-degradation property: adding boosters cannot worsen the learned distribution and typically improves it. Empirically, Boosted GFlowNets achieve substantially better exploration and sample diversity on multimodal synthetic benchmarks and peptide design tasks, while preserving the stability and simplicity of standard trajectory-balance training.
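The residual principle described in the abstract can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the residual form (reward minus the unnormalized mass already captured, clipped at a small floor) and the idealized `train_booster` stand-in for trajectory-balance training are assumptions made for the sketch.

```python
# Conceptual sketch of the residual-reward principle behind Boosted
# GFlowNets. Actual GFlowNet training (trajectory balance) is replaced
# by an idealized booster that samples exactly proportionally to its
# target reward; the residual form below is an illustrative assumption.

# Toy multimodal reward over 10 discrete states: two modes (0 and 9).
reward = [5.0, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 4.0]
total = sum(reward)

def residual_reward(prev_dist, eps=1e-8):
    """Reward for the next booster: whatever mass the previous models
    failed to capture, floored at eps so log-rewards stay finite."""
    return [max(r - total * p, eps) for r, p in zip(reward, prev_dist)]

def train_booster(r):
    """Stand-in for GFlowNet training: assume the booster learns to
    sample exactly proportionally to its target reward."""
    z = sum(r)
    return [x / z for x in r]

# Suppose the first GFlowNet mode-collapsed onto state 0 only.
p1 = [1.0] + [0.0] * 9

# The residual reward vanishes at the covered mode but stays large at
# the missed mode (state 9), reactivating the learning signal there.
r2 = residual_reward(p1)
p2 = train_booster(r2)
print(p2[0], p2[9])  # second booster concentrates on the missed mode
```

Under these assumptions, the second booster places most of its mass on state 9, the mode the first model missed, which is the coverage-repair behavior the abstract attributes to the residual reward.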

Page Count
22 pages

Category
Computer Science:
Machine Learning (CS)