On Linear Estimators for some Stable Vectors
By: Rayan Chouity, Charbel Hannoun, Jihad Fahs, and more
We consider the estimation problem for jointly stable random variables under two specific dependency models: a linear transformation of two independent stable variables, and a sub-Gaussian symmetric $α$-stable (S$α$S) vector. In both cases, we show that the conditional mean estimator is linear. Moreover, we find the dispersion-optimal linear estimators. Interestingly, for the sub-Gaussian S$α$S vector, the two estimators coincide, generalizing the well-known Gaussian result that the conditional mean is the best linear minimum mean-square error estimator.
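A minimal numerical sketch of the sub-Gaussian case (an illustration, not the paper's derivation): if $X = \sqrt{A}\,G$ with $A$ a positive $(α/2)$-stable variable independent of a zero-mean Gaussian vector $G$ with covariance $\Sigma$, then conditioning on $(A, X_2)$ and applying the tower property suggests $\mathbb{E}[X_1 \mid X_2] = (\Sigma_{12}/\Sigma_{22})\,X_2$, the same linear form as in the Gaussian case. The Python snippet below checks this slope empirically; the covariance matrix, stability index, sample size, and binning scheme are illustrative assumptions.

```python
import numpy as np
from scipy.stats import levy_stable

# Sketch (not from the paper): simulate a bivariate sub-Gaussian SαS vector
# X = sqrt(A) * G, with A a totally skewed positive (α/2)-stable variable and
# G an independent zero-mean Gaussian vector, then check empirically that
# E[X1 | X2] is approximately linear in X2 with slope Σ12 / Σ22.

rng = np.random.default_rng(0)
alpha = 1.5                      # stability index; 1 < alpha < 2 so the mean exists
n = 100_000

# Underlying Gaussian covariance (assumed for this illustration)
Sigma = np.array([[1.0, 0.6],
                  [0.6, 1.0]])

# Positive (α/2)-stable mixing variable. The scale (cos(pi*alpha/4))**(2/alpha)
# is the usual normalization in the sub-Gaussian representation; the linear
# regression slope below does not depend on this choice.
scale_A = np.cos(np.pi * alpha / 4.0) ** (2.0 / alpha)
A = levy_stable.rvs(alpha / 2.0, 1.0, loc=0.0, scale=scale_A,
                    size=n, random_state=rng)

G = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=n)
X = np.sqrt(A)[:, None] * G      # sub-Gaussian SαS vector (X1, X2)

# Empirical regression E[X1 | X2]: bin on X2 and average X1 within each bin.
x1, x2 = X[:, 0], X[:, 1]
edges = np.quantile(x2, np.linspace(0.05, 0.95, 20))   # trim extreme tails
idx = np.digitize(x2, edges)
centers = np.array([x2[idx == k].mean() for k in range(1, len(edges))])
cond_means = np.array([x1[idx == k].mean() for k in range(1, len(edges))])

slope_theory = Sigma[0, 1] / Sigma[1, 1]
slope_emp = np.polyfit(centers, cond_means, 1)[0]
print(f"theoretical slope Σ12/Σ22 = {slope_theory:.3f}, empirical ≈ {slope_emp:.3f}")
```

For $1 < α < 2$ the conditional mean exists, and the fitted slope should be close to $\Sigma_{12}/\Sigma_{22} = 0.6$ under the assumed covariance; agreement improves slowly with sample size because of the heavy tails of $A$.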
Similar Papers
Functional uniqueness and stability of Gaussian priors in optimal L1 estimation
Information Theory
Makes computers guess better with less data.
One-Bit Distributed Mean Estimation with Unknown Variance
Information Theory
Helps computers guess averages with tiny messages.
Uniform inference in linear mixed models
Statistics Theory
Improves math models for tricky data.