Learning to accelerate Krasnosel'skii-Mann fixed-point iterations with guarantees

Published: January 12, 2026 | arXiv ID: 2601.07665v1

By: Andrea Martin, Giuseppe Belgioioso

Potential Business Impact:

Accelerates the iterative fixed-point solvers underlying many optimization and control applications, while preserving their convergence guarantees.

Business Areas:
A/B Testing, Data and Analytics

We introduce a principled learning-to-optimize (L2O) framework for solving fixed-point problems involving general nonexpansive mappings. Our idea is to deliberately inject summable perturbations into a standard Krasnosel'skii-Mann iteration to improve its average-case performance over a specific distribution of problems while retaining its convergence guarantees. Under a metric sub-regularity assumption, we prove that the proposed parametrization includes only iterations that locally achieve linear convergence (up to a vanishing bias term) and that it encompasses all iterations that do so at a sufficiently fast rate. We then demonstrate how our framework can be used to augment several widely used operator splitting methods to accelerate the solution of structured monotone inclusion problems, and we validate our approach on a best approximation problem using an L2O-augmented Douglas-Rachford splitting algorithm.
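To make the idea concrete, the sketch below implements a perturbed Krasnosel'skii-Mann (KM) iteration z_{k+1} = z_k + alpha (T(z_k) - z_k) + eps_k, where the injected perturbations eps_k are summable, which is the structural condition the paper relies on to keep the classical KM convergence guarantee. This is not the authors' code: in the paper the eps_k come from a learned model, whereas here they are hypothetical placeholders decaying like 1/(k+1)^2, and the operator T is a Douglas-Rachford operator for a toy feasibility problem, loosely echoing the paper's best-approximation experiment. All function names and parameters are illustrative assumptions.

```python
import numpy as np

def perturbed_km(T, z0, alpha=0.5, iters=500, C=1.0):
    """KM iteration with summable injected perturbations (hypothetical eps_k)."""
    z = np.asarray(z0, dtype=float)
    for k in range(iters):
        # Placeholder for a learned perturbation; any choice with
        # sum_k ||eps_k|| < inf preserves the fixed-point guarantee.
        eps = (C / (k + 1) ** 2) * np.ones_like(z) / np.sqrt(z.size)
        z = z + alpha * (T(z) - z) + eps
    return z

# Toy feasibility problem: find a point in the intersection of the
# halfspace {x : a^T x <= b} and the unit ball.
a, b = np.array([1.0, 1.0]), 1.0

def proj_halfspace(x):
    v = a @ x - b
    return x if v <= 0 else x - (v / (a @ a)) * a

def proj_ball(x):
    n = np.linalg.norm(x)
    return x if n <= 1 else x / n

def T_dr(z):
    # Douglas-Rachford operator T = (1/2)(I + R_ball o R_halfspace),
    # with reflections R = 2P - I; T is nonexpansive, so KM applies.
    r1 = 2 * proj_halfspace(z) - z
    return 0.5 * (z + 2 * proj_ball(r1) - r1)

z = perturbed_km(T_dr, np.array([2.0, 2.0]))
x = proj_halfspace(z)  # shadow point; approaches the intersection
print(x, a @ x, np.linalg.norm(x))
```

Because the perturbations are summable, the iterates behave like the unperturbed Douglas-Rachford sequence up to a vanishing offset; the L2O ingredient in the paper is to choose those perturbations so the method is fast on average over a problem distribution.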

Country of Origin
🇸🇪 Sweden

Page Count
7 pages

Category
Electrical Engineering and Systems Science: Systems and Control