DDEQs: Distributional Deep Equilibrium Models through Wasserstein Gradient Flows
By: Jonathan Geuter, Clément Bonet, Anna Korba, and more
Potential Business Impact:
Helps computers understand shapes and groups of dots.
Deep Equilibrium Models (DEQs) are a class of implicit neural networks that solve for a fixed point of a neural network in their forward pass. Traditionally, DEQs took sequences as inputs, but they have since been applied to a variety of data types. In this work, we present Distributional Deep Equilibrium Models (DDEQs), extending DEQs to discrete measure inputs, such as sets or point clouds. We provide a theoretically grounded framework for DDEQs. Leveraging Wasserstein gradient flows, we show how the forward pass of the DEQ can be adapted to find fixed points of discrete measures under permutation-invariance, and we derive suitable network architectures for DDEQs. In experiments, we show that DDEQs can compete with state-of-the-art models on tasks such as point cloud classification and point cloud completion, while being significantly more parameter-efficient.
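To make the fixed-point idea concrete, here is a minimal sketch of a DEQ-style forward pass on a point cloud, using plain fixed-point iteration with a convergence tolerance. This is not the paper's method: DDEQs replace this naive iteration with a Wasserstein gradient flow over discrete measures, and the names `PointUpdate`, `deq_forward`, `dim`, and `tol` are hypothetical, chosen only for illustration.

```python
# Sketch only: naive fixed-point iteration standing in for the paper's
# Wasserstein-gradient-flow forward pass. All names are hypothetical.
import torch
import torch.nn as nn


class PointUpdate(nn.Module):
    """A per-point update map f(z, x); applying it pointwise keeps the
    map equivariant under permutations of the point cloud."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, z: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # z, x: (n_points, dim); concatenate features per point.
        return self.net(torch.cat([z, x], dim=-1))


def deq_forward(f: nn.Module, x: torch.Tensor,
                max_iter: int = 100, tol: float = 1e-4) -> torch.Tensor:
    """Iterate z <- f(z, x) until ||z_next - z|| drops below tol,
    returning an approximate fixed point z* = f(z*, x)."""
    z = torch.zeros_like(x)
    for _ in range(max_iter):
        z_next = f(z, x)
        if torch.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z


# Usage: a toy point cloud of 128 points in 3D.
x = torch.randn(128, 3)
f = PointUpdate(dim=3)
z_star = deq_forward(f, x)  # approximate fixed point of f(., x)
```

The sketch shows why DEQs are parameter-efficient: the same small module `f` is reused at every iteration, so depth comes from iteration rather than from stacked layers. The paper's contribution is to run this solve in the space of discrete measures rather than on raw feature tensors.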
Similar Papers
Gradient flow for deep equilibrium single-index models
Machine Learning (CS)
Makes super-deep computer brains learn faster.
Reversible Deep Equilibrium Models
Machine Learning (CS)
Makes AI learn better with fewer steps.
Equivariant Deep Equilibrium Models for Imaging Inverse Problems
Image and Video Processing
Trains AI to fix images without perfect examples.