Maxout Polytopes
By: Andrei Balakin, Shelby Cox, Georg Loho, and more
Potential Business Impact:
Makes computer brains learn faster and better.
Maxout polytopes are defined by feedforward neural networks with the maxout activation function and non-negative weights after the first layer. We characterize the parameter spaces and extremal f-vectors of maxout polytopes for shallow networks, and we study the separating hypersurfaces that arise when a layer is added to the network. We also show that maxout polytopes are cubical for generic networks without bottlenecks.
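To make the setup concrete, here is a minimal sketch (our own illustrative sizes and weights, not the paper's construction) of the kind of network the abstract describes: each maxout unit takes the pointwise maximum of several affine maps, and combining the units with non-negative second-layer weights preserves convexity, so the network computes a convex piecewise-linear function.

```python
import numpy as np

def maxout_unit(x, W, b):
    # A maxout unit: the pointwise maximum of the affine maps x -> W[k] @ x + b[k].
    # Each such unit computes a convex piecewise-linear function of x.
    return np.max(W @ x + b)

# Illustrative shallow network: three maxout units on R^2, each the max of
# four affine maps, with arbitrary first-layer weights (sizes are assumptions).
rng = np.random.default_rng(0)
units = [(rng.normal(size=(4, 2)), rng.normal(size=4)) for _ in range(3)]
c = np.array([1.0, 0.5, 2.0])  # non-negative weights after the first layer

def network(x):
    # A non-negative combination of convex functions is convex, so this
    # shallow network is again convex piecewise-linear in x.
    return c @ np.array([maxout_unit(x, W, b) for W, b in units])
```

Convexity can be spot-checked numerically, e.g. `network((x + y) / 2) <= (network(x) + network(y)) / 2` for any two points `x`, `y`.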
Similar Papers
On the expressivity of sparse maxout networks
Machine Learning (CS)
Makes computer brains learn more with less data.
Geometry and Optimization of Shallow Polynomial Networks
Machine Learning (CS)
Teaches computers to learn from data patterns.
Feature Learning Beyond the Edge of Stability
Machine Learning (CS)
Makes AI learn better and faster.