Branching Strategies Based on Subgraph GNNs: A Study on Theoretical Promise versus Practical Reality
By: Junru Zhou, Yicheng Wang, Pan Li
Potential Business Impact:
Makes computers solve hard math problems faster.
Graph Neural Networks (GNNs) have emerged as a promising approach for "learning to branch" in Mixed-Integer Linear Programming (MILP). While standard Message-Passing GNNs (MPNNs) are efficient, they theoretically lack the expressive power to fully represent MILP structures. Conversely, higher-order GNNs (such as 2-FGNNs) are expressive but computationally prohibitive. In this work, we investigate Subgraph GNNs as a theoretical middle ground. Crucially, while previous work [Chen et al., 2025] demonstrated that GNNs with 3-WL expressive power can approximate Strong Branching, we prove a sharper result: node-anchored Subgraph GNNs, whose expressive power is strictly lower than 3-WL [Zhang et al., 2023], are sufficient to approximate Strong Branching scores. However, our extensive empirical evaluation on four benchmark datasets reveals a stark contrast between theory and practice. While node-anchored Subgraph GNNs theoretically offer superior branching decisions, their $O(n)$ complexity overhead results in significant memory bottlenecks and slower end-to-end solving times than both MPNNs and classical heuristics. Our results indicate that, for MILP branching, the computational cost of expressive GNNs currently outweighs their gains in decision quality, suggesting that future research must focus on efficiency-preserving expressivity.
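To make the $O(n)$ overhead concrete, below is a minimal sketch (not the authors' implementation; all class and function names are hypothetical) of a node-anchored Subgraph GNN in plain PyTorch. The input graph is replicated once per anchor node, an anchor-indicator feature breaks symmetry in each copy, and a shared MPNN runs over all $n$ copies, so activation memory grows from $O(n)$ node states to $O(n^2)$.

```python
# Minimal sketch of a node-anchored Subgraph GNN (hypothetical names, plain
# PyTorch). Not the paper's architecture; it only illustrates the O(n)
# replication cost discussed in the abstract.
import torch
import torch.nn as nn


class MPNNLayer(nn.Module):
    """One round of mean-aggregation message passing."""

    def __init__(self, dim: int):
        super().__init__()
        self.lin = nn.Linear(2 * dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        deg = adj.sum(-1, keepdim=True).clamp(min=1.0)
        msg = (adj @ h) / deg  # mean over neighbors; adj broadcasts over copies
        return torch.relu(self.lin(torch.cat([h, msg], dim=-1)))


class NodeAnchoredSubgraphGNN(nn.Module):
    """Runs a shared MPNN on n copies of the graph, one per anchor node.

    The anchor indicator is the only change relative to a plain MPNN, but it
    breaks symmetries an MPNN cannot distinguish -- and it forces the O(n)
    blow-up in activations.
    """

    def __init__(self, in_dim: int, hidden: int, num_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(in_dim + 1, hidden)  # +1 for the anchor flag
        self.layers = nn.ModuleList(
            [MPNNLayer(hidden) for _ in range(num_layers)]
        )

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        n = x.size(0)
        # (n_anchors, n_nodes, in_dim + 1): copy i flags node i as the anchor.
        flags = torch.eye(n).unsqueeze(-1)
        h = self.embed(torch.cat([x.unsqueeze(0).expand(n, -1, -1), flags], -1))
        for layer in self.layers:
            h = layer(h, adj)
        # Pool each anchored copy into one summary per node. Activations are
        # O(n^2 * hidden) here, versus O(n * hidden) for a plain MPNN.
        return h.mean(dim=1)  # shape: (n, hidden)


# Toy usage on a random 6-node graph; a score head on top of the returned
# summaries would play the role of a Strong Branching surrogate per variable.
n, d = 6, 4
x = torch.randn(n, d)
adj = (torch.rand(n, n) < 0.4).float()
adj = ((adj + adj.T) > 0).float().fill_diagonal_(0)
model = NodeAnchoredSubgraphGNN(d, hidden=16)
print(model(x, adj).shape)  # torch.Size([6, 16])
```

The n-fold replication visible in the forward pass is the memory bottleneck the empirical evaluation reports: each MILP variable node spawns its own graph copy, so the cost scales with instance size even before branching begins.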
Similar Papers
Dynamic Stratified Contrastive Learning with Upstream Augmentation for MILP Branching
Machine Learning (CS)
Teaches computers to solve hard math problems faster.
Using Subgraph GNNs for Node Classification: an Overlooked Potential Approach
Machine Learning (CS)
Makes smart computer networks learn faster and better.
A Distributed Training Architecture For Combinatorial Optimization
Machine Learning (CS)
Solves hard problems on huge networks faster.