On least-squares estimation of the residual variance in the first-order moving average model
Related papers
2020
When modeling time series data using autoregressive-moving average processes, it is common practice to presume that the residuals are normally distributed. However, sometimes we encounter non-normal residuals and asymmetry in the marginal distribution of the data. Despite the widespread use of pure autoregressive processes for modeling non-normal time series, autoregressive-moving average models have been used less often. The main reason is the difficulty of estimating the autoregressive-moving average model parameters. The purpose of this study is to address this difficulty by approximating the maximum likelihood estimators, which is particularly important from a model selection perspective. Accordingly, the coefficients and residual distribution parameters of the first-order stationary autoregressive-moving average model, with residuals that follow the exponential and Weibull families, were estimated. Then, in a simulation study, the theoretical results were investigated, and it was shown that the modified maximum likelihood estimators are suitable for estimating the first-order autoregressive-moving average model parameters in the non-normal case. In a numerical example, the positive skewness of the residuals obtained from fitting the first-order autoregressive-moving average model was shown. Following that, the parameters of the candidate residual distributions were estimated by the modified maximum likelihood estimators, and one of the estimated models was selected for modeling the data.
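The asymmetry described in the abstract can be illustrated with a minimal numpy sketch: simulate an ARMA(1,1) series whose innovations are centred exponential (hence skewed), and compute the sample skewness of the marginal distribution. The parameter values and variable names here are illustrative assumptions, not taken from the paper, and this is not the paper's modified maximum likelihood estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
phi, theta = 0.6, 0.5   # illustrative AR and MA coefficients

# Exponential innovations with scale 1, centred to have mean zero
eps = rng.exponential(scale=1.0, size=n + 1) - 1.0

# ARMA(1,1): x_t = phi * x_{t-1} + eps_t + theta * eps_{t-1}
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t + 1] + theta * eps[t]

# Sample skewness of the marginal distribution of the series
s = x - x.mean()
skew = np.mean(s**3) / np.mean(s**2) ** 1.5
print(skew)
```

Because the series is a linear filter with positive weights applied to positively skewed innovations, the marginal skewness stays positive, matching the kind of residual asymmetry the abstract reports.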
On Residual Variance Estimation in Autoregressive Models
Journal of Time Series Analysis, 1998
In this paper we consider time series models belonging to the autoregressive (AR) family and deal with the estimation of the residual variance. This is important because estimates of the variance are involved in, for example, confidence sets for the parameters of the model, estimation of the spectrum, expressions for the estimated error of prediction and sample quantities used to make inferences about the order of the model. We consider the asymptotic biases for moment and least squares estimators of the residual variance, and compare them with known results when available and with those for maximum likelihood estimators under normality. Simulation results are presented for finite samples.
Least squares estimation of the linear model with autoregressive errors
1992
A Monte Carlo study of the least squares estimator of the regression model with autocorrelated errors is presented. The model contains a stationary explanatory variable and a random walk explanatory variable. The error model is a first-order autoregressive model, and the unit root case is included in the simulations. The limiting distributions of the regression pivotals for the basic model are normal, while the statistics for the autoregressive coefficient have a distribution that depends on the true parameter. The agreement between the Monte Carlo results and the asymptotic theory depends upon the autoregressive coefficient and on the nature of the explanatory variable.
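The stationary-regressor case can be sketched in a few lines: regress y on an i.i.d. explanatory variable when the errors follow an AR(1), and collect the regression pivotal (b − β)/se over Monte Carlo replications. With an i.i.d. regressor independent of the errors, the pivotal is approximately standard normal, which is the normal limiting behaviour the abstract describes for the basic model. Sample sizes, parameter values, and the i.i.d. choice of regressor are illustrative assumptions, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 100, 500
beta, rho = 1.0, 0.8   # illustrative true values

pivots = []
for _ in range(reps):
    # Stationary (i.i.d.) explanatory variable and AR(1) errors
    w = rng.normal(size=n)
    u = np.zeros(n)         # error process started at zero for simplicity
    e = rng.normal(size=n)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + e[t]
    y = beta * w + u

    # OLS slope and its standard error
    b = (w @ y) / (w @ w)
    resid = y - b * w
    se = np.sqrt(resid @ resid / (n - 1) / (w @ w))
    pivots.append((b - beta) / se)

pivots = np.array(pivots)
print(pivots.mean(), pivots.std())
```

The random-walk-regressor and unit-root cases studied in the paper behave differently: there the statistics for the autoregressive coefficient have nonstandard limiting distributions.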
The second-order bias and mean squared error of estimators in time-series models
Journal of Econometrics, 2007
We develop analytical results on the second-order bias and mean squared error (MSE) of estimators in time series. These results provide a unified approach to developing the properties of a large class of estimators in linear and nonlinear time series models; they are valid for both normal and non-normal samples of observations, and where the regressors are stochastic. The estimators included are the generalized method of moments, maximum likelihood, least squares, and other extremum estimators. Our general results are applied to a wide variety of econometric models. Numerical results for some of these models are presented. The approach takes the expectation of the Taylor series expansion of the estimator and then solves for E(β̂ − β).
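A concrete instance of such a second-order bias term is the classical least squares bias in the AR(1) model without intercept, E(φ̂ − φ) ≈ −2φ/T. The sketch below checks this approximation by Monte Carlo. This particular formula is the well-known Marriott–Pope/White-type result used here for illustration; it is an assumption that it corresponds to a special case of the paper's general expressions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, phi, reps = 50, 0.5, 10000   # illustrative sample size and coefficient

est = np.empty(reps)
for r in range(reps):
    x = np.zeros(n)
    e = rng.normal(size=n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    y, z = x[1:], x[:-1]
    est[r] = (z @ y) / (z @ z)   # least squares estimate of phi

# Monte Carlo bias versus the classical first-order approximation -2*phi/T
bias_mc = est.mean() - phi
bias_approx = -2 * phi / n
print(bias_mc, bias_approx)
```

With T = 50 and φ = 0.5, both the simulated and the approximated bias are around −0.02, i.e. the estimator is biased toward zero.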
International Journal of Advanced Statistics and Probability, 2013
This paper compares a Least-Squared Random Coefficient Autoregressive (RCA) model with a Least-Squared RCA model based on Autocorrelated Errors (RCA-AR). We looked at only the first-order models, denoted RCA(1) and RCA(1)-AR(1). The efficiency of the Least-Squared method was checked by applying the models to Brownian motion and the Wiener process, and the efficiency followed closely the asymptotic properties of a normal distribution. In a simulation study, we compared the performance of RCA(1) and RCA(1)-AR(1) using the mean squared error (MSE) as a criterion. RCA(1) exhibited good estimation performance in both the stationary and nonstationary cases. On the other hand, when the data oscillate around their mean, RCA(1)-AR(1) performed better. For real-world data, we applied the two models to the daily volume of the Thai gold price and found that RCA(1)-AR(1) performed better than RCA(1).
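The RCA(1) model lets the autoregressive coefficient itself be random: x_t = (β + b_t) x_{t−1} + ε_t, with b_t a zero-mean disturbance. Since E[x_t | x_{t−1}] = β x_{t−1}, least squares still targets the mean coefficient β, which the sketch below demonstrates. All parameter values and names are illustrative assumptions; the paper's RCA(1)-AR(1) variant with autocorrelated errors is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(5)
n, beta, tau = 500, 0.4, 0.1   # illustrative; beta^2 + tau^2 < 1 for stationarity

# RCA(1): x_t = (beta + b_t) * x_{t-1} + eps_t
x = np.zeros(n)
b = rng.normal(scale=tau, size=n)   # random-coefficient disturbance
eps = rng.normal(size=n)
for t in range(1, n):
    x[t] = (beta + b[t]) * x[t - 1] + eps[t]

# Least squares estimate of the mean coefficient beta
y, z = x[1:], x[:-1]
beta_hat = (z @ y) / (z @ z)
print(beta_hat)
```

In a Monte Carlo comparison such as the paper's, one would repeat this over many replications and score each model by the MSE of beta_hat.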
Improved estimation of the linear regression model with autocorrelated errors
1990
For estimating the coefficient vector of a linear regression model with first order autoregressive disturbances, a family of Stein-rule estimators based on the two-step Prais-Winsten estimating procedure is considered and an Edgeworth type asymptotic expansion for its distribution is derived. The performance of this family of estimators relative to the two-step Prais-Winsten estimator is also derived under a squared error loss function.
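The two-step Prais–Winsten procedure that underlies the estimators in this paper can be sketched directly: fit OLS, estimate the AR(1) disturbance coefficient from the residuals, then refit on the quasi-differenced data, keeping the first observation scaled by sqrt(1 − ρ̂²). The single-regressor setup and parameter values are illustrative assumptions, and the Stein-rule shrinkage layer studied in the paper is not implemented.

```python
import numpy as np

rng = np.random.default_rng(4)
n, beta, rho = 200, 2.0, 0.6   # illustrative true values

# Regression with AR(1) disturbances
x = rng.normal(size=n)
u = np.zeros(n)
e = rng.normal(size=n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + e[t]
y = beta * x + u

# Step 1: OLS, then estimate rho from the residuals
b_ols = (x @ y) / (x @ x)
r = y - b_ols * x
rho_hat = (r[:-1] @ r[1:]) / (r[:-1] @ r[:-1])

# Step 2: Prais-Winsten quasi-differencing (retains the first observation)
c = np.sqrt(1 - rho_hat**2)
x_t = np.concatenate(([c * x[0]], x[1:] - rho_hat * x[:-1]))
y_t = np.concatenate(([c * y[0]], y[1:] - rho_hat * y[:-1]))
b_pw = (x_t @ y_t) / (x_t @ x_t)
print(b_ols, b_pw)
```

Retaining the transformed first observation is what distinguishes Prais–Winsten from Cochrane–Orcutt; the Stein-rule family analyzed in the paper shrinks the resulting estimator toward a target.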