Rigorous Implications of the Low-Degree Heuristic
By: Jun-Ting Hsieh, Daniel M. Kane, Pravesh K. Kothari, and others
Potential Business Impact:
Proves computers can't tell some patterns apart.
Over the past decade, the low-degree heuristic has been used to estimate algorithmic thresholds for a wide range of average-case planted-vs-null distinguishing problems. Such results rely on the hypothesis that if the low-degree moments of the planted and null distributions are sufficiently close, then no efficient (noise-tolerant) algorithm can distinguish between them. This hypothesis is appealing because the low-degree likelihood ratio (LDLR) -- a quantity that measures the similarity between low-degree moments -- is simple to calculate. However, despite sustained interest in the area, it remains unclear whether low-degree indistinguishability actually rules out any interesting class of algorithms. In this work, we initiate the study of, and develop technical tools for, translating LDLR upper bounds into rigorous lower bounds against concrete algorithms. As a consequence, we prove the following for any permutation-invariant distribution $\mathsf{P}$:

1. If $\mathsf{P}$ is over $\{0,1\}^n$ and is low-degree indistinguishable from $U = \mathrm{Unif}(\{0,1\}^n)$, then a noisy version of $\mathsf{P}$ is statistically indistinguishable from $U$.
2. If $\mathsf{P}$ is over $\mathbb{R}^n$ and is low-degree indistinguishable from the standard Gaussian ${N}(0, 1)^n$, then no statistic based on symmetric polynomials of degree at most $O(\log n/\log \log n)$ can distinguish a noisy version of $\mathsf{P}$ from ${N}(0, 1)^n$.
3. If $\mathsf{P}$ is over $\mathbb{R}^{n\times n}$ and is low-degree indistinguishable from ${N}(0,1)^{n\times n}$, then no constant-sized subgraph statistic can distinguish a noisy version of $\mathsf{P}$ and ${N}(0, 1)^{n\times n}$.
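To make the LDLR concrete, here is a minimal sketch (not from the paper, and using a toy example rather than any of its distributions): for a planted distribution $\mathsf{P}$ over $\{0,1\}^n$ with likelihood ratio $L = \mathsf{P}/U$, the squared norm of the degree-$\le D$ projection is $\|L^{\le D}\|^2 = \sum_{|S| \le D} \hat{L}(S)^2$, where $\hat{L}(S) = \mathbb{E}_{x \sim U}[L(x)\chi_S(x)]$ are Fourier coefficients. We compute it by brute force for the classic planted-parity example, where $L$ has all its non-trivial Fourier mass on the top degree, so $\|L^{\le D}\|^2 = 1$ (no low-degree advantage) for every $D < n$.

```python
import itertools

n = 6  # small enough to enumerate all 2^n points exactly

def parity(x):
    return sum(x) % 2

# Toy planted distribution P: uniform over even-parity strings.
# Likelihood ratio L(x) = P(x)/U(x) = 2 * 1[parity(x) == 0].
points = list(itertools.product([0, 1], repeat=n))
L = {x: (2.0 if parity(x) == 0 else 0.0) for x in points}

def chi(S, x):
    # Fourier character chi_S(x) = (-1)^{sum_{i in S} x_i}
    return (-1) ** sum(x[i] for i in S)

def ldlr_sq_norm(D):
    # ||L^{<=D}||^2 = sum over |S| <= D of Lhat(S)^2,
    # with Lhat(S) = E_{x ~ U}[L(x) * chi_S(x)].
    total = 0.0
    for d in range(D + 1):
        for S in itertools.combinations(range(n), d):
            coef = sum(L[x] * chi(S, x) for x in points) / len(points)
            total += coef ** 2
    return total

# L = 1 + chi_{[n]}, so only the empty set contributes below degree n:
print(ldlr_sq_norm(n - 1))  # -> 1.0 (low-degree indistinguishable from U)
print(ldlr_sq_norm(n))      # -> 2.0 (the full-parity character appears)
```

The heuristic reads $\|L^{\le D}\|^2 = 1 + o(1)$ as evidence that no efficient algorithm distinguishes $\mathsf{P}$ from $U$; the paper's contribution is to convert such bounds into rigorous lower bounds against concrete algorithm classes.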
Similar Papers
The Quasi-Polynomial Low-Degree Conjecture is False
Computational Complexity
Proves some math problems are easier than thought.
Algorithmic contiguity from low-degree conjecture and applications in correlated random graphs
Machine Learning (Stat)
Makes it harder for computers to find hidden patterns.
Computational Lower Bounds for Correlated Random Graphs via Algorithmic Contiguity
Machine Learning (Stat)
Makes hard computer problems impossible to solve.