Score: 1

Towards a Unified Analysis of Neural Networks in Nonparametric Instrumental Variable Regression: Optimization and Generalization

Published: November 18, 2025 | arXiv ID: 2511.14710v1

By: Zonghao Chen, Atsushi Nitanda, Arthur Gretton, and more

Potential Business Impact:

Teaches computers to learn cause-and-effect relationships from logged past decisions, supporting better offline decision-making.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

We establish the first global convergence result for neural networks trained with the two-stage least squares (2SLS) approach in nonparametric instrumental variable regression (NPIV). This is achieved by adopting a lifted perspective through mean-field Langevin dynamics (MFLD); unlike standard MFLD, however, our setting of 2SLS entails a \emph{bilevel} optimization problem in the space of probability measures. To address this challenge, we leverage the penalty gradient approach recently developed for bilevel optimization, which reformulates the bilevel problem as a Lagrangian one. This leads to a novel, fully first-order algorithm, termed \texttt{F$^2$BMLD}. Beyond the convergence bound, we further provide a generalization bound, revealing an inherent trade-off in the choice of the Lagrange multiplier between optimization and statistical guarantees. Finally, we empirically validate the effectiveness of the proposed method on an offline reinforcement learning benchmark.
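As a rough illustration (not the authors' \texttt{F$^2$BMLD} implementation), the 2SLS problem in NPIV can be read as a bilevel program: the outer stage minimizes the projected risk $\mathbb{E}[(Y - g_f(Z))^2]$ over the structural function $f$, while the inner stage fits $g_f(Z) \approx \mathbb{E}[f(X)\mid Z]$. The sketch below is a hedged, minimal analogue: the network widths, penalty weight \texttt{lam}, noise scale \texttt{temperature}, and toy data are all hypothetical choices, the inner constraint is replaced by a plain penalty term rather than the paper's penalty-gradient Lagrangian machinery, and noisy (Langevin-style) gradient steps on a single finite-width network stand in for mean-field Langevin dynamics over probability measures.

```python
# A minimal, hypothetical sketch of penalized 2SLS with noisy gradient steps;
# it is NOT the paper's F^2BMLD algorithm, only a finite-width analogue.
import torch
import torch.nn as nn

def mlp(d_in, d_out, width=64):
    # Small neural network; width and activation are illustrative choices.
    return nn.Sequential(nn.Linear(d_in, width), nn.Tanh(), nn.Linear(width, d_out))

# f: candidate structural function X -> Y; g: stage-1 regression Z -> E[f(X) | Z].
f_net, g_net = mlp(1, 1), mlp(1, 1)
params = list(f_net.parameters()) + list(g_net.parameters())

lr, lam, temperature = 1e-2, 10.0, 1e-5  # step size, penalty weight, noise level (assumed)
opt = torch.optim.SGD(params, lr=lr)

# Toy NPIV data: instrument Z, endogenous X sharing an unobserved confounder with Y.
n = 512
Z = torch.randn(n, 1)
conf = torch.randn(n, 1)
X = Z + conf + 0.1 * torch.randn(n, 1)
Y = torch.sin(X) + conf + 0.1 * torch.randn(n, 1)

for step in range(2000):
    opt.zero_grad()
    gz, fx = g_net(Z), f_net(X)
    outer = ((Y - gz) ** 2).mean()   # outer (projected) 2SLS risk
    inner = ((fx - gz) ** 2).mean()  # inner stage-1 fitting risk, used here as a penalty
    (outer + lam * inner).backward()
    with torch.no_grad():
        # Langevin-style noise: after the lr-scaled step, the injected noise has
        # standard deviation sqrt(2 * lr * temperature), as in SGLD.
        for p in params:
            p.grad.add_(torch.randn_like(p), alpha=(2 * temperature / lr) ** 0.5)
    opt.step()
```

In the paper's lifted formulation, such updates would instead be interpreted as dynamics over a distribution of network parameters, and the abstract's noted trade-off concerns how the Lagrange multiplier (here the fixed \texttt{lam}) balances optimization speed against statistical guarantees.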

Repos / Data Links

Page Count
68 pages

Category
Statistics: Machine Learning (stat.ML)