MAP Estimation with Denoisers: Convergence Rates and Guarantees

Published: July 21, 2025 | arXiv ID: 2507.15397v2

By: Scott Pesme, Giacomo Meanti, Michael Arbel and more

Potential Business Impact:

Gives theoretical guarantees for using pretrained denoisers to reconstruct clean data from noisy or corrupted measurements in inverse problems.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Denoiser models have become powerful tools for inverse problems, enabling the use of pretrained networks to approximate the score of a smoothed prior distribution. These models are often used in heuristic iterative schemes aimed at solving Maximum a Posteriori (MAP) optimisation problems, where the proximal operator of the negative log-prior plays a central role. In practice, this operator is intractable, and practitioners plug in a pretrained denoiser as a surrogate, despite the lack of general theoretical justification for this substitution. In this work, we show that a simple algorithm, closely related to several used in practice, provably converges to the proximal operator under a log-concavity assumption on the prior $p$. We show that this algorithm can be interpreted as a gradient descent on smoothed proximal objectives. Our analysis thus provides a theoretical foundation for a class of empirically successful but previously heuristic methods.
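To make the setting concrete, here is the standard formulation this line of work builds on (a hedged summary; the paper's own notation may differ). With $g = -\log p$, MAP estimation and the proximal operator it relies on read
$$\hat{x}_{\mathrm{MAP}} \in \arg\min_x \; -\log p(y \mid x) - \log p(x), \qquad \mathrm{prox}_{\lambda g}(z) = \arg\min_x \; \tfrac{1}{2\lambda}\|x - z\|^2 - \log p(x),$$
and a denoiser $D_\sigma$ trained at noise level $\sigma$ approximates the score of the smoothed prior $p_\sigma$ through Tweedie's identity, $D_\sigma(z) \approx z + \sigma^2 \nabla \log p_\sigma(z)$, which is what lets it stand in for the intractable proximal step.

The sketch below is a minimal illustration of the kind of iteration described in the abstract: gradient descent on a smoothed proximal objective with the denoiser supplying the score term. It is not the paper's exact algorithm; the function name, signature, and step-size choices are assumptions made for illustration only, and `denoiser` is assumed to be any callable `denoiser(x, sigma)` returning a denoised array.

```python
import numpy as np

def prox_via_denoiser(z, denoiser, sigma, lam, n_iters=200, step=0.05):
    """Illustrative sketch (not the paper's exact method): approximate
    prox_{lam * g}(z) with g = -log p by gradient descent on the smoothed
    objective (1/(2*lam))||x - z||^2 - log p_sigma(x)."""
    x = np.array(z, dtype=float, copy=True)
    for _ in range(n_iters):
        # Tweedie's identity: grad log p_sigma(x) ~ (D_sigma(x) - x) / sigma^2
        score = (denoiser(x, sigma) - x) / sigma**2
        # Gradient of the smoothed proximal objective
        grad = (x - z) / lam - score
        x -= step * grad
    return x
```

Under the paper's log-concavity assumption on the prior, iterations of this form are shown to converge to the true proximal operator; the sketch above only conveys the structure of such a scheme.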

Page Count
30 pages

Category
Computer Science:
Machine Learning (CS)