Optimal Estimation for General Gaussian Processes
By: Tetsuya Takabatake, Jun Yu, Chen Zhang
Potential Business Impact:
Makes computer predictions more accurate and reliable.
This paper proposes a novel exact maximum likelihood (ML) estimation method for general Gaussian processes, in which all parameters are estimated jointly. The exact ML estimator (MLE) is consistent and asymptotically normally distributed. We prove the local asymptotic normality (LAN) property of the sequence of statistical experiments for general Gaussian processes in the sense of Le Cam, thereby enabling optimal estimation and facilitating statistical inference. The results rely solely on the asymptotic behavior of the spectral density near zero, allowing them to be widely applied. The established optimality not only addresses the gap left by Adenstedt (1974), who proposed an efficient but infeasible estimator for the long-run mean $\mu$, but also enables us to evaluate the finite-sample performance of the existing method, the commonly used plug-in MLE, in which the sample mean is substituted into the likelihood. Our simulation results show that the plug-in MLE performs nearly as well as the exact MLE, alleviating concerns that inefficient estimation of $\mu$ would compromise the efficiency of the remaining parameter estimates.
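The contrast between the two estimators in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's method: it assumes a simple stationary Gaussian process with exponential covariance $\gamma(h) = \sigma^2 \rho^{|h|}$ (the paper covers general Gaussian processes through the spectral density near zero), and compares the exact MLE, which estimates $\mu$ jointly with the covariance parameters, against the plug-in MLE, which fixes $\mu$ at the sample mean before maximizing the likelihood.

```python
# Hedged sketch: exact vs. plug-in Gaussian maximum likelihood.
# The exponential-covariance model and all parameter values below
# are illustrative assumptions, not taken from the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 200
true_mu, true_sigma2, true_rho = 1.0, 2.0, 0.6

def cov_matrix(sigma2, rho, n):
    # Stationary covariance gamma(h) = sigma2 * rho**|h|.
    h = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return sigma2 * rho**h

x = rng.multivariate_normal(np.full(n, true_mu),
                            cov_matrix(true_sigma2, true_rho, n))

def neg_loglik(theta, mu=None):
    # Unconstrained reparametrization: sigma2 > 0, |rho| < 1.
    sigma2, rho = np.exp(theta[0]), np.tanh(theta[1])
    m = theta[2] if mu is None else mu
    S = cov_matrix(sigma2, rho, n)
    _, logdet = np.linalg.slogdet(S)
    r = x - m
    return 0.5 * (logdet + r @ np.linalg.solve(S, r))

# Exact MLE: mu is estimated jointly with (sigma2, rho).
exact = minimize(neg_loglik, x0=[0.0, 0.0, x.mean()],
                 method="Nelder-Mead")
# Plug-in MLE: the sample mean is substituted into the likelihood.
plugin = minimize(lambda th: neg_loglik(th, mu=x.mean()),
                  x0=[0.0, 0.0], method="Nelder-Mead")

print("exact  MLE:", x.mean(), np.exp(exact.x[0]), np.tanh(exact.x[1]),
      "mu-hat:", exact.x[2])
print("plug-in MLE:", np.exp(plugin.x[0]), np.tanh(plugin.x[1]))
```

In simulations of this kind, the two sets of covariance-parameter estimates are typically very close, which is consistent with the paper's finding that substituting the sample mean costs little efficiency in finite samples.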
Similar Papers
Unbiased Estimation of Multi-Way Gravity Models
Econometrics
Fixes math problems for better predictions.
Rates of Convergence of Maximum Smoothed Log-Likelihood Estimators for Semi-Parametric Multivariate Mixtures
Statistics Theory
Makes smart guesses about mixed data more reliable.
Revisiting Penalized Likelihood Estimation for Gaussian Processes
Methodology
Improves computer predictions when data is scarce.