Score: 1

Geometric Algorithms for Neural Combinatorial Optimization with Constraints

Published: October 28, 2025 | arXiv ID: 2510.24039v1

By: Nikolaos Karalias, Akbar Rafiey, Yifei Xu, and more

BigTech Affiliations: Massachusetts Institute of Technology

Potential Business Impact:

Trains neural networks to produce feasible solutions to constrained combinatorial optimization problems, such as selecting the best fixed-size subset of items, without requiring labeled examples.

Business Areas:
Personalization, Commerce and Shopping

Self-Supervised Learning (SSL) for Combinatorial Optimization (CO) is an emerging paradigm for solving combinatorial problems using neural networks. In this paper, we address a central challenge of SSL for CO: solving problems with discrete constraints. We design an end-to-end differentiable framework that enables us to solve discrete constrained optimization problems with neural networks. Concretely, we leverage algorithmic techniques from the literature on convex geometry and Carathéodory's theorem to decompose neural network outputs into convex combinations of polytope corners that correspond to feasible sets. This decomposition-based approach not only enables self-supervised training but also ensures efficient, quality-preserving rounding of the neural network's output into feasible solutions. Extensive experiments on cardinality-constrained optimization show that our approach consistently outperforms neural baselines. We further provide worked-out examples of how our method can be applied beyond cardinality-constrained problems to a diverse set of combinatorial optimization tasks, including finding independent sets in graphs and solving matroid-constrained problems.
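To make the decomposition idea concrete, here is a minimal Python sketch for the cardinality-constrained case: a fractional point in the cardinality polytope (entries in [0,1] summing to k) is greedily written as a convex combination of indicator vectors of size-k subsets, and rounding simply returns the best corner of that combination. The greedy top-k rule, the function names, and the toy objective are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def cardinality_decomposition(x, k, tol=1e-9):
    """Decompose x (0 <= x <= 1, sum(x) = k) into a convex combination of
    indicator vectors of size-k subsets (Caratheodory-style decomposition
    over the cardinality polytope). Returns (weights, subsets) with
    sum(weights) ~ 1. Illustrative sketch, not the paper's exact method."""
    r = np.asarray(x, dtype=float).copy()
    mu = 1.0                       # coefficient mass still to be assigned
    weights, subsets = [], []
    n = len(r)
    for _ in range(n + 1):         # at most ~n corners are needed
        if mu <= tol:
            break
        # pick the k largest residual coordinates as the next feasible corner
        S = np.argsort(-r)[:k]
        in_S = np.zeros(n, dtype=bool)
        in_S[S] = True
        # largest step that keeps the residual decomposable: it must stay
        # nonnegative, and no coordinate outside S may exceed the remaining mass
        outside_max = r[~in_S].max() if (~in_S).any() else 0.0
        lam = min(r[S].min(), mu - outside_max, mu)
        weights.append(lam)
        subsets.append(set(S.tolist()))
        r[S] -= lam
        mu -= lam
    return weights, subsets

def round_to_best_corner(x, k, objective):
    """Quality-preserving rounding: evaluate the objective on each corner of
    the decomposition and return the best feasible subset. For a linear
    objective, the best corner is at least as good as the fractional point,
    since the corners' weighted average equals the value at x."""
    _, subsets = cardinality_decomposition(x, k)
    return max(subsets, key=objective)

if __name__ == "__main__":
    n, k = 8, 3
    # toy "neural network output": a point guaranteed to lie in the
    # cardinality polytope, built as a mix of three size-k indicators
    def indicator(S):
        v = np.zeros(n)
        v[list(S)] = 1.0
        return v
    x = 0.5 * indicator({0, 1, 2}) + 0.3 * indicator({2, 3, 4}) + 0.2 * indicator({5, 6, 7})
    scores = np.random.default_rng(0).random(n)          # toy linear objective
    best = round_to_best_corner(x, k, lambda S: sum(scores[i] for i in S))
    print("feasible solution of size", len(best), ":", sorted(best))
```

In a training loop, the same decomposition lets the self-supervised loss be evaluated as a weighted sum over feasible corners, which is what keeps the pipeline end-to-end differentiable while the final rounding step stays exact and feasible.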

Country of Origin
🇺🇸 United States

Page Count
50 pages

Category
Computer Science:
Machine Learning (CS)