Hard Thresholding Pursuit Algorithms for Least Absolute Deviations Problem
By: Jiao Xu, Peng Li, Bing Zheng
Least absolute deviations (LAD) is a statistical optimality criterion widely utilized in scenarios where a minority of measurements are contaminated by outliers of arbitrary magnitude. In this paper, we study the robustness to outliers of a variant of adaptive iterative hard thresholding, known as the graded fast hard thresholding pursuit (GFHTP$_1$) algorithm. Unlike the majority of state-of-the-art algorithms in this field, GFHTP$_1$ does not require prior information about the signal's sparsity. Moreover, its design is parameter-free, which not only simplifies implementation but also removes the burden of parameter tuning. Numerical experiments reveal that the GFHTP$_1$ algorithm consistently outperforms competing algorithms in terms of both robustness and computational efficiency.
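To make the setting concrete, the sketch below shows a generic iterative hard-thresholding step applied to the LAD objective $\min_x \|y - Ax\|_1$ subject to $\|x\|_0 \le s$, using the subgradient update $x \leftarrow H_s\!\left(x + \mu A^\top \operatorname{sign}(y - Ax)\right)$. This is an illustrative baseline only, not the paper's GFHTP$_1$: unlike GFHTP$_1$, it assumes the sparsity level `s` is known, and the step size `mu` and iteration count are hypothetical choices.

```python
import numpy as np

def hard_threshold(x, s):
    # H_s: keep the s largest-magnitude entries of x, zero out the rest.
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def iht_lad(A, y, s, mu=0.01, iters=500):
    # Generic iterative hard thresholding for the LAD data-fidelity term:
    #   min ||y - A x||_1  s.t.  ||x||_0 <= s,
    # via the subgradient step x <- H_s(x + mu * A^T sign(y - A x)).
    # The sign() saturates gross outliers in the residual, which is the
    # source of LAD's robustness compared with least squares.
    # Illustrative sketch only; s, mu, and iters are assumed parameters,
    # not part of the parameter-free GFHTP_1 design.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        r = y - A @ x
        x = hard_threshold(x + mu * A.T @ np.sign(r), s)
    return x

# Toy usage: sparse signal, a few measurements hit by gross outliers.
rng = np.random.default_rng(0)
m, n, s = 100, 256, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
supp = rng.choice(n, size=s, replace=False)
x_true[supp] = rng.choice([-1.0, 1.0], size=s)
y = A @ x_true
y[:5] += 50.0  # outliers of arbitrary magnitude on a minority of measurements
x_hat = iht_lad(A, y, s)
```

Because the update only sees the sign of the residual, the five corrupted measurements contribute a bounded gradient term regardless of how large the corruption is, which is why the LAD criterion tolerates outliers that would dominate a squared-error fit.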