Score-Based VAMP with Fisher-Information-Based Onsager Correction
By: Tadashi Wadayama, Takumi Takahashi
Potential Business Impact:
Lets computers recover signals from noisy measurements even when the underlying statistical model is unknown, by learning it from data.
We propose score-based VAMP (SC-VAMP), a variant of vector approximate message passing (VAMP) in which the Onsager correction is expressed and computed via conditional Fisher information, thereby enabling a Jacobian-free implementation. Using learned score functions, SC-VAMP constructs nonlinear MMSE estimators through Tweedie's formula and derives the corresponding Onsager terms from score-norm statistics, avoiding the need for analytical derivatives of the prior or likelihood. When combined with random orthogonal/unitary mixing to mitigate non-ideal (structured or correlated) sensing settings, the proposed framework extends VAMP to complex black-box inference problems where explicit modeling is intractable. Finally, by leveraging the entropic central limit theorem (CLT), we provide an information-theoretic perspective on the Gaussian approximation underlying state evolution (SE), offering insight into the decoupling principle beyond idealized i.i.d. settings, including nonlinear regimes.
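The two ingredients named in the abstract, Tweedie's formula for the MMSE denoiser and a score-norm estimate of the Onsager (divergence) term, can be illustrated in a toy setting. The sketch below is not the paper's implementation: it uses a scalar Gaussian prior, for which the score of the Gaussian-smoothed density and the exact divergence are available in closed form, so the Jacobian-free estimate can be checked against the analytic answer. Function names (`tweedie_denoise`, `onsager_alpha`) are illustrative assumptions.

```python
import numpy as np

def score_gaussian(r, tau, sigma2):
    # Score of the smoothed density p(r) for r = x + N(0, sigma2 I)
    # with prior x ~ N(0, tau I): r ~ N(0, (tau + sigma2) I),
    # so s(r) = grad log p(r) = -r / (tau + sigma2).
    return -r / (tau + sigma2)

def tweedie_denoise(r, sigma2, score):
    # Tweedie's formula: E[x | r] = r + sigma2 * score(r).
    return r + sigma2 * score(r)

def onsager_alpha(r, sigma2, score):
    # Jacobian-free divergence estimate. By Stein's identity, the Fisher
    # information E[||s||^2] equals -E[tr grad s], so the average divergence
    # (1/N) tr d(x_hat)/dr = 1 + sigma2 * (1/N) tr grad s(r)
    # can be estimated from the score norm alone, with no Jacobian.
    s = score(r)
    return 1.0 - sigma2 * np.mean(s ** 2)

rng = np.random.default_rng(0)
N, tau, sigma2 = 200_000, 2.0, 0.5
x = rng.normal(0.0, np.sqrt(tau), N)          # signal from the prior
r = x + rng.normal(0.0, np.sqrt(sigma2), N)   # pseudo-observation

score = lambda r_: score_gaussian(r_, tau, sigma2)
x_hat = tweedie_denoise(r, sigma2, score)     # MMSE estimate via Tweedie
alpha_score = onsager_alpha(r, sigma2, score) # score-norm Onsager estimate
alpha_exact = tau / (tau + sigma2)            # analytic divergence here
```

For this prior, Tweedie's formula reduces to the familiar shrinkage `x_hat = r * tau / (tau + sigma2)`, and the score-norm estimate of the divergence concentrates around `tau / (tau + sigma2)` as `N` grows, which is the point of the Fisher-information-based correction: with a learned score model in place of `score_gaussian`, the same two functions apply to priors with no closed form.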