Sensor network localization has a benign landscape after low-dimensional relaxation
By: Christopher Criscitiello, Andrew D. McRae, Quentin Rebjock, and more
Potential Business Impact:
Find hidden locations from distance clues.
We consider the sensor network localization problem, also known as multidimensional scaling or Euclidean distance matrix completion. Given a ground-truth configuration of $n$ points in $\mathbb{R}^\ell$, we observe a subset of the pairwise distances and aim to recover the underlying configuration (up to rigid transformations). We show with a simple counterexample that the associated optimization problem is nonconvex and may admit spurious local minimizers, even when all distances are known. Yet, inspired by numerical experiments, we argue that all second-order critical points become global minimizers when the problem is relaxed by optimizing over configurations in dimension $k > \ell$. Specifically, we show this in two settings, in both of which all pairwise distances are known: (1) for arbitrary ground-truth points and $k = O(\sqrt{\ell n})$, and (2) for isotropic random ground-truth points and $k = O(\ell + \log n)$. To prove these results, we identify and exploit key properties of the linear map that sends inner products to squared distances.
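The relaxed formulation described in the abstract is straightforward to experiment with numerically. Below is a minimal sketch (not the authors' code), assuming a standard least-squares objective over configurations $Y \in \mathbb{R}^{n \times k}$ with all pairwise distances observed; the point count, dimensions, and choice of solver are illustrative assumptions.

```python
# A minimal numerical sketch (not the authors' code) of the relaxed problem:
# all pairwise distances of n points in R^ell are observed, and we minimize a
# least-squares objective over configurations in R^k with k > ell. The
# objective, problem sizes, and solver choice are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, ell = 30, 2        # number of points and true dimension
k = ell + 2           # relaxed embedding dimension, k > ell

X = rng.standard_normal((n, ell))   # ground-truth configuration (isotropic)

def sq_dists(Y):
    """Squared-distance matrix of the rows of Y, via the linear map that
    sends the Gram matrix (inner products) to squared distances."""
    G = Y @ Y.T
    g = np.diag(G)
    return g[:, None] + g[None, :] - 2.0 * G

D2 = sq_dists(X)                    # observed squared distances (all pairs)

def fun(y_flat):
    """Objective f(Y) = (1/4) * sum_{i,j} (||y_i - y_j||^2 - d_ij^2)^2
    and its gradient with respect to Y (flattened for the solver)."""
    Y = y_flat.reshape(n, k)
    R = sq_dists(Y) - D2            # residuals on all pairs (symmetric)
    loss = 0.25 * np.sum(R**2)
    grad = 2.0 * (np.diag(R.sum(axis=1)) - R) @ Y
    return loss, grad.ravel()

res = minimize(fun, rng.standard_normal(n * k), jac=True, method="L-BFGS-B")
Y_hat = res.x.reshape(n, k)

print(f"final objective: {res.fun:.3e}")
print(f"max squared-distance error: {np.abs(sq_dists(Y_hat) - D2).max():.3e}")
```

In runs of this kind, the relaxation to $k > \ell$ typically drives the objective to numerical zero from a random initialization, consistent with the benign-landscape claim, whereas with $k = \ell$ some initializations can stall at spurious local minimizers.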
Similar Papers
Reducing Sensor Requirements by Relaxing the Network Metric Dimension
Discrete Mathematics
Find sources of problems with fewer sensors.
Deep Learning and Matrix Completion-aided IoT Network Localization in the Outlier Scenarios
Machine Learning (CS)
Finds lost devices even with bad signals.
Provable Non-Convex Euclidean Distance Matrix Completion: Geometry, Reconstruction, and Robustness
Optimization and Control
Finds object shapes from partial distance clues.