Towards Scaling Laws for Symbolic Regression
By: David Otte, Jörg K. H. Franke, Frank Hutter
Potential Business Impact:
Finds math rules hidden in data.
Symbolic regression (SR) aims to discover the underlying mathematical expressions that explain observed data. This holds promise both for gaining scientific insight and for producing inherently interpretable and generalizable models for tabular data. Deep learning-based SR has recently become competitive with genetic programming approaches, but the role of scale has remained largely unexplored. Inspired by scaling laws in language modeling, we present the first systematic investigation of scaling in SR, using a scalable end-to-end transformer pipeline and carefully generated training data. Across five different model sizes and spanning three orders of magnitude in compute, we find that both validation loss and solved rate follow clear power-law trends with compute. We further identify compute-optimal hyperparameter scaling: optimal batch size and learning rate grow with model size, and a token-to-parameter ratio of ≈15 is optimal in our regime, with a slight upward trend as compute increases. These results demonstrate that SR performance is largely predictable from compute and offer important insights for training the next generation of SR models.
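The two headline findings (power-law loss vs. compute, and a token-to-parameter ratio of roughly 15) can be illustrated with a minimal sketch. The snippet below is not the paper's pipeline: the compute/loss values and the 100M-parameter model size are placeholder assumptions, and only the functional form (a log-log linear fit of loss against compute) reflects the kind of scaling-law analysis the abstract describes.

```python
import numpy as np

# Placeholder (compute, validation loss) measurements -- NOT the paper's data.
compute = np.array([1e17, 3e17, 1e18, 3e18, 1e19, 1e20])   # training FLOPs
val_loss = np.array([0.92, 0.78, 0.66, 0.57, 0.48, 0.35])  # validation loss

# A power law L(C) = a * C^(-b) is linear in log-log space:
# log L = log a - b * log C, so ordinary least squares recovers (a, b).
slope, intercept = np.polyfit(np.log(compute), np.log(val_loss), deg=1)
a, b = np.exp(intercept), -slope
print(f"fitted power law: L(C) ~= {a:.3g} * C^(-{b:.3f})")

# Extrapolate the fitted trend to a larger compute budget.
print(f"predicted loss at 1e21 FLOPs: {a * 1e21 ** (-b):.3f}")

# Compute-optimal data budget under the reported ~15 tokens-per-parameter ratio:
# for a model with N parameters, train on roughly 15 * N tokens.
n_params = 100e6  # hypothetical 100M-parameter SR transformer
optimal_tokens = 15 * n_params
print(f"~{optimal_tokens:.2e} training tokens for a {n_params:.0e}-parameter model")
```

Fitting in log-log space is the standard way such scaling curves are estimated; with real training runs one would substitute measured compute budgets and validation losses for the placeholder arrays above.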
Similar Papers
Decomposable Neuro Symbolic Regression
Machine Learning (CS)
Finds simple math rules for complex data.
Current Challenges of Symbolic Regression: Optimization, Selection, Model Simplification, and Benchmarking
Neural and Evolutionary Computing
Finds simpler math rules for better predictions.
Towards symbolic regression for interpretable clinical decision scores
Machine Learning (CS)
Creates easy-to-understand clinical scoring rules from patient data.