On least-squares estimation of the residual variance in the first-order moving average model

In the first-order moving average model we analyze the behavior of the least-squares estimator of the variance of the random residual. This procedure is incorporated into some widely used computer programs. We show through simulations that the asymptotic formulas for the bias and variance of the maximum likelihood estimator can be used as approximations for the least-squares estimator, at least when the model parameter is far from the region of non-invertibility. Asymptotic results are developed using the "long autoregression" idea, which leads to a closed-form expression for the least-squares estimator. This estimator is then compared with the maximum likelihood estimator under normality, both in its exact form and in an approximate version obtained by approximating the matrix in the exponent of the Gaussian likelihood function. The comparison is illustrated by numerical examples. The dependence of the bias results on the value of the model parameter is emphasized.
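The "long autoregression" idea can be sketched as follows: an invertible MA(1) process admits an AR(∞) representation, so truncating it at a large order p and fitting that AR(p) by ordinary least squares yields a residual-variance estimate. The snippet below is an illustrative sketch only, not the paper's exact estimator; the choices of sample size n, truncation order p, and parameter value theta are assumptions made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an MA(1) process: x_t = e_t + theta * e_{t-1}
# (theta, sigma2, n are illustrative choices, not from the paper)
theta, sigma2, n = 0.5, 1.0, 2000
e = rng.normal(scale=np.sqrt(sigma2), size=n + 1)
x = e[1:] + theta * e[:-1]

# "Long autoregression": approximate the invertible MA(1) by an AR(p)
# with p large, then estimate the residual variance by least squares.
p = 20
Y = x[p:]
X = np.column_stack([x[p - k:-k] for k in range(1, p + 1)])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Least-squares estimate of the residual variance
resid = Y - X @ beta
sigma2_hat = resid @ resid / len(Y)
print(sigma2_hat)
```

With theta well inside the invertibility region, as here, the truncated AR(p) fit recovers the residual variance closely; near the non-invertibility boundary (|theta| close to 1) the AR(∞) coefficients decay slowly and a fixed truncation order becomes a poor approximation, which is consistent with the dependence on the model parameter noted above.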