Double Descent and Overparameterization in Particle Physics Data
By: Matthias Vigl, Lukas Heinrich
Potential Business Impact:
Makes machine-learning models better at predicting physics results.
Recently, the benefit of heavily overparameterized models has been observed in machine learning tasks: models with enough capacity to easily cross the "interpolation threshold" show improved generalization error compared to models in the classical bias-variance tradeoff regime. We demonstrate this behavior for the first time in particle physics data and explore when and where "double descent" appears and under which circumstances overparameterization yields a performance gain.
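The double descent phenomenon described in the abstract can be reproduced in a few lines on toy data. The sketch below (an illustrative assumption, not the authors' setup or their particle physics data) fits minimum-norm least squares on random ReLU features: as the number of features grows past the number of training points (the interpolation threshold), test error first spikes and then descends again.

```python
import numpy as np

def double_descent_curve(widths, n_train=20, n_test=500, noise=0.3, seed=0):
    """Test MSE of minimum-norm least squares on random ReLU features,
    swept over feature counts (model capacities). Hypothetical toy setup."""
    rng = np.random.default_rng(seed)
    x_tr = rng.uniform(-1, 1, n_train)
    x_te = rng.uniform(-1, 1, n_test)
    target = lambda x: np.sin(2 * np.pi * x)       # toy target function
    y_tr = target(x_tr) + noise * rng.normal(size=n_train)
    y_te = target(x_te)

    errs = []
    for p in widths:
        # random ReLU features: phi_j(x) = max(0, w_j * x + b_j)
        w = rng.normal(size=p)
        b = rng.normal(size=p)
        Phi_tr = np.maximum(0.0, np.outer(x_tr, w) + b)
        Phi_te = np.maximum(0.0, np.outer(x_te, w) + b)
        # pinv returns the minimum-norm least-squares solution, which is
        # the relevant interpolator in the overparameterized regime p > n_train
        theta = np.linalg.pinv(Phi_tr) @ y_tr
        errs.append(float(np.mean((Phi_te @ theta - y_te) ** 2)))
    return errs

widths = [2, 5, 10, 20, 40, 100, 400]   # interpolation threshold near p = 20
errs = double_descent_curve(widths)
```

Plotting `errs` against `widths` typically shows the classical U-shape below the threshold, a peak near `p = n_train`, and a second descent in the heavily overparameterized regime.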
Similar Papers
A dynamic view of some anomalous phenomena in SGD
Optimization and Control
Helps computers learn better by finding hidden patterns.
The Double Descent Behavior in Two Layer Neural Network for Binary Classification
Machine Learning (Stat)
Finds a sweet spot for computer learning accuracy.
Dissecting the Impact of Model Misspecification in Data-driven Optimization
Machine Learning (CS)
Improves computer decisions when learning from imperfect data.