Universal Architectures for the Learning of Polyhedral Norms and Convex Regularizers
By: Michael Unser, Stanislas Ducotterd
Potential Business Impact:
Reconstructs clearer images from less measurement data than older sparsity-based methods.
This paper addresses the task of learning convex regularizers to guide the reconstruction of images from limited data. By imposing that the reconstruction be amplitude-equivariant, we narrow down the class of admissible functionals to those that can be expressed as a power of a seminorm. We then show that such functionals can be approximated to arbitrary precision with the help of polyhedral norms. In particular, we identify two dual parameterizations of such systems: (i) a synthesis form with an $\ell_1$-penalty that involves some learnable dictionary; and (ii) an analysis form with an $\ell_\infty$-penalty that involves a trainable regularization operator. After providing geometric insights and proving that the two forms are universal, we propose an implementation that relies on a specific architecture (tight frame with a weighted $\ell_1$ penalty) that is easy to train. We illustrate its use for denoising and the reconstruction of biomedical images. We find that the proposed framework outperforms the sparsity-based methods of compressed sensing, while offering essentially the same convergence and robustness guarantees.
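To make the proposed architecture concrete, here is a minimal NumPy sketch of denoising with a tight-frame, weighted $\ell_1$ regularizer $R(x) = \|\mathrm{diag}(w)\,Wx\|_1$. It uses a random orthonormal matrix as a (trivial) Parseval tight frame and fixed weights; in the paper both the frame and the weights would be learned, so the names `W`, `w`, and `lam` here are illustrative assumptions, not the authors' code. For an orthonormal `W`, the denoising problem $\min_x \tfrac{1}{2}\|x-y\|^2 + \lambda R(x)$ has a closed-form solution via soft-thresholding in the frame domain.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16

# Random orthonormal matrix as a (trivial) Parseval tight frame: W.T @ W = I.
# In the paper, the frame is learnable; a fixed one is used here for illustration.
W, _ = np.linalg.qr(rng.standard_normal((n, n)))
w = np.ones(n)   # per-coefficient weights (also learnable in the paper)
lam = 0.5        # regularization strength (illustrative value)

def soft(z, t):
    """Soft-thresholding: the proximal operator of the weighted l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def denoise(y, lam, W, w):
    """Closed-form prox of R(x) = ||diag(w) W x||_1 when W is orthonormal:
    analyze, shrink each coefficient, then synthesize back."""
    return W.T @ soft(W @ y, lam * w)

y = rng.standard_normal(n)   # noisy observation
x = denoise(y, lam, W, w)    # regularized reconstruction
```

For a general (redundant) tight frame the prox no longer has this closed form and one would iterate (e.g. with FISTA), but the analyze-shrink-synthesize structure above is the core computational step.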
Similar Papers
From Graphical Lasso to Atomic Norms: High-Dimensional Pattern Recovery
Statistics Theory
Finds hidden patterns in complex data.
A Graphical Global Optimization Framework for Parameter Estimation of Statistical Models with Nonconvex Regularization Functions
Optimization and Control
Helps computers solve hard optimization problems faster.
Neural Network Enhanced Polyconvexification of Isotropic Energy Densities in Computational Mechanics
Numerical Analysis
Makes engineering computer models work much faster.