Closed-form results for vector moving average models with a univariate estimation approach
Related papers
Vector Autoregressive Moving Average Model with Scalar Moving Average
arXiv (Cornell University), 2019
We show that Vector Autoregressive Moving Average models with scalar Moving Average components can be estimated by generalized least squares (GLS) for each fixed moving average polynomial. The conditional variance of the GLS model is the concentrated covariance matrix of the moving average process. Under GLS, the likelihood function of these models has a form similar to that of their VAR counterparts, so maximum likelihood estimation reduces to a gradient-based optimization over the moving average parameters alone. These models are inexpensive generalizations of Vector Autoregressive models. We also discuss a relationship between this result and the Borodin-Okounkov formula in operator theory.
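As a rough illustration of the estimation idea described above, the sketch below (my construction, not the paper's code) simulates a bivariate VARMA(1,1) with a scalar MA term, inverts the scalar MA filter for a trial value of the MA parameter, fits the AR part by least squares on the filtered data, and maximizes the resulting concentrated Gaussian log-likelihood over the single MA parameter with SciPy's bounded scalar optimizer.

```python
# A minimal sketch (my construction, assuming a VARMA(1,1) with scalar MA) of
# profiling the likelihood over the moving average parameter:
#   y_t = A y_{t-1} + e_t + theta * e_{t-1},   |theta| < 1.
# For a fixed theta, the scalar MA filter is inverted and the AR part is fitted
# by least squares; the outer step optimizes the concentrated log-likelihood
# over theta alone.
import numpy as np
from scipy.optimize import minimize_scalar

def concentrated_loglik(theta, y, p=1):
    """Concentrated Gaussian log-likelihood for a fixed scalar MA parameter."""
    T, _ = y.shape
    # Approximate inverse MA filter: z_t = y_t - theta * z_{t-1} (zero start-up)
    z = np.zeros_like(y)
    for t in range(T):
        z[t] = y[t] - (theta * z[t - 1] if t > 0 else 0.0)
    Y, X = z[p:], z[p - 1:-1]                    # regress z_t on z_{t-1}
    A_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ A_hat
    Sigma_hat = resid.T @ resid / len(Y)         # concentrated error covariance
    return -0.5 * len(Y) * np.linalg.slogdet(Sigma_hat)[1]

# Simulate a bivariate VARMA(1,1) with a scalar MA coefficient of 0.6
rng = np.random.default_rng(0)
A_true, theta_true, T = np.array([[0.5, 0.1], [0.0, 0.4]]), 0.6, 400
e = rng.standard_normal((T + 1, 2))
y = np.zeros((T, 2))
for t in range(T):
    y[t] = (A_true @ y[t - 1] if t > 0 else 0.0) + e[t + 1] + theta_true * e[t]

res = minimize_scalar(lambda th: -concentrated_loglik(th, y),
                      bounds=(-0.95, 0.95), method="bounded")
print("estimated scalar MA parameter:", res.x)
```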
Estimation of parameters using an updated vector autoregressive model
International Journal of Contemporary Mathematical Sciences
Many statistical models, whether deterministic or stochastic, contain a number of parameters. Maximum likelihood estimation (MLE) and least squares estimation (LSE) are the most commonly applied estimation methods, and in both the central task is estimating the parameters, a step that cannot be avoided in model building. In this paper, parameter estimation in an updated vector autoregressive model is presented. We estimate the parameters with a dual estimation approach, specifically joint estimation, which estimates both the state and the parameters, and apply it to VAR models in one and two dimensions. The results show that the parameter estimates converge to the true parameter values as time evolves.
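The abstract does not spell out the filter used, so the sketch below illustrates one common form of joint state/parameter estimation: an extended Kalman filter on an augmented state [y_t, a] for a univariate AR(1), where the AR coefficient is treated as a slowly varying state and its estimate is updated at every observation. All tuning constants (Q, R) are assumptions of mine.

```python
# A minimal sketch (my own illustration, not the paper's exact filter) of joint
# state/parameter estimation for a univariate AR(1) model y_t = a*y_{t-1} + w_t.
# The state is augmented to x_t = [y_t, a] and tracked with an extended Kalman
# filter; the observation is y_t itself plus small measurement noise.
import numpy as np

rng = np.random.default_rng(1)
a_true, T = 0.7, 2000
y = np.zeros(T)
for t in range(1, T):
    y[t] = a_true * y[t - 1] + rng.standard_normal()

x = np.array([0.0, 0.0])            # [y estimate, parameter estimate]
P = np.eye(2)                       # state covariance
Q = np.diag([1.0, 1e-5])            # process noise (small random walk on a)
R = 1e-2                            # observation noise variance
H = np.array([[1.0, 0.0]])          # we observe the first state component

for t in range(1, T):
    # Predict: y_pred = a_hat * y_prev, parameter carried forward unchanged
    y_prev, a_hat = x
    x_pred = np.array([a_hat * y_prev, a_hat])
    F = np.array([[a_hat, y_prev],  # Jacobian of the transition map
                  [0.0,   1.0]])
    P_pred = F @ P @ F.T + Q
    # Update with the new observation y[t]
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T / S
    x = x_pred + (K * (y[t] - x_pred[0])).ravel()
    P = (np.eye(2) - K @ H) @ P_pred

print("estimated AR coefficient:", x[1])
```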
The output of a causal, stable, time-invariant nonlinear filter can be approximately represented by the linear and quadratic terms of a finite-parameter Volterra series expansion. We call this representation the "quadratic nonlinear MA model" since it is the logical extension of the usual linear MA process. Where the actual generating mechanism for the data is fairly smooth, this quadratic MA model should provide a better approximation to the true dynamics than the two-state threshold autoregression and Markov switching models usually considered. As with linear MA processes, the nonlinear MA model coefficients can be estimated via least squares fitting, but it is essential to begin with a reasonably parsimonious model identification and non-arbitrary preliminary estimates for the parameters. In linear ARMA modeling these are derived from the sample correlogram and the sample partial correlogram, but these tools are confounded by nonlinearity in the generating mechanism. H...
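Because the shocks of a nonlinear MA model are unobserved, the hedged sketch below proxies them with residuals from a preliminary long autoregression and then fits the linear and quadratic Volterra terms by least squares; the paper's own identification and preliminary-estimation tools differ, so this is only a schematic of the least-squares step.

```python
# A minimal sketch (assumptions of mine, not the authors' procedure) of fitting
# a quadratic nonlinear MA model by least squares.  The unobserved shocks are
# proxied by residuals from a preliminary long-AR fit, and y_t is then regressed
# on lagged shocks and their pairwise products.
import numpy as np
from itertools import combinations_with_replacement

def fit_quadratic_ma(y, q=2, ar_order=10):
    T = len(y)
    # Step 1: proxy the shocks with residuals from a long autoregression
    X_ar = np.column_stack([y[ar_order - i - 1:T - i - 1] for i in range(ar_order)])
    y_ar = y[ar_order:]
    phi, *_ = np.linalg.lstsq(X_ar, y_ar, rcond=None)
    e = np.concatenate([np.zeros(ar_order), y_ar - X_ar @ phi])
    # Step 2: regress y_t on e_{t-1..t-q} and their products (the quadratic terms)
    rows, targets = [], []
    for t in range(q, T):
        lags = e[t - q:t][::-1]                       # e_{t-1}, ..., e_{t-q}
        quads = [lags[i] * lags[j]
                 for i, j in combinations_with_replacement(range(q), 2)]
        rows.append(np.concatenate([[1.0], lags, quads]))
        targets.append(y[t])
    beta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return beta                                        # intercept, linear, quadratic

rng = np.random.default_rng(2)
e_true = rng.standard_normal(3000)
y = e_true.copy()
y[1:] += 0.5 * e_true[:-1] + 0.3 * e_true[:-1] ** 2    # a simple quadratic MA(1)
print(fit_quadratic_ma(y, q=1))
```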
Vector Autoregressive Moving Average Identification for Macroeconomic Modeling: Algorithms and Theory
2009
This paper develops a new methodology for identifying the structure of VARMA time series models. The analysis proceeds by examining the echelon canonical form and presents a fully automatic, data-driven approach to model specification using a new technique to determine the Kronecker invariants. A novel feature of the inferential procedures developed here is that they work in terms of a canonical scalar ARMAX representation in which the exogenous regressors are given by predetermined contemporaneous and lagged values of other variables in the VARMA system. This feature facilitates the construction of algorithms which, from the perspective of macroeconomic modeling, are efficacious in that they do not use AR approximations at any stage. Algorithms that are applicable to both asymptotically stationary and unit-root, partially nonstationary (cointegrated) time series models are presented. A sequence of lemmas and theorems shows that the algorithms are based on calculations that yield stro...
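The following sketch is only meant to illustrate the scalar ARMAX representation the paper works with, not its identification algorithm: one equation of a simulated bivariate system is fitted as a scalar ARMAX(1,1) in which contemporaneous and lagged values of the other variable enter as exogenous regressors, here via statsmodels' SARIMAX.

```python
# A schematic (assumptions of mine, not the paper's algorithm) of the scalar
# ARMAX representation: one equation of a simulated bivariate system is fitted
# as ARMAX(1,1), with contemporaneous and lagged values of the other variable
# supplied as exogenous regressors.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(3)
T = 500
x, y = np.zeros(T), np.zeros(T)
e = rng.standard_normal((T, 2))
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + e[t, 0]
    y[t] = 0.4 * y[t - 1] + 0.3 * x[t] + 0.2 * x[t - 1] + e[t, 1] + 0.5 * e[t - 1, 1]

exog = np.column_stack([x, np.r_[0.0, x[:-1]]])   # x_t and x_{t-1}
model = SARIMAX(y, exog=exog, order=(1, 0, 1))    # scalar ARMAX(1,1) for y
result = model.fit(disp=False)
print(result.params)
```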
Empirical Vector Autoregressive Modeling
Lecture Notes in Economics and Mathematical Systems, 1994
Table of contents (excerpt): 2 The Unrestricted VAR and its components; 2.1 Introduction; 2.2 The model; 2.3 Univariate processes and unit roots; 2.4 Integrated processes; 2.4.1 Definitions and notation; 2.4.2 MA representation, autocorrelation and pseudo spectrum; 2.5 Alternative models for nonstationarity, long memory and persistence; 2.5.1 Nonstationarity; 2.5.2 Long memory, the variance time function and adjusted range analysis; 2.5.3 Persistence; Appendix A2.1 MA representation integrated process; A2.1.1 MA representations; A2.
An Algorithm for the Exact Likelihood of a Stationary Vector Autoregressive-Moving Average Model
The so-called innovations form of the likelihood function implied by a stationary vector autoregressive-moving average model is considered without directly using a state–space representation. Specifically, it is shown in detail how to compute the exact likelihood by an adaptation to the multivariate case of the innovations algorithm of Ansley (1979) for univariate models. Comparisons with other existing methods are also provided, showing that the algorithm described here is computationally more efficient than the fastest methods currently available in many cases of practical interest.
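For intuition, the sketch below implements the univariate innovations algorithm (in the spirit of Ansley 1979 and Brockwell-Davis) for an MA(1) process: the autocovariance function is turned into one-step prediction errors and their variances, from which the exact Gaussian log-likelihood follows without a state-space representation. The multivariate adaptation described in the paper replaces these scalar recursions with matrix ones; the example here is my simplification.

```python
# A minimal univariate sketch (assumptions of mine) of the innovations form of
# the Gaussian likelihood: the innovations algorithm converts the autocovariance
# function into one-step prediction errors and variances, giving the exact
# likelihood directly.  Shown here for an MA(1) process.
import numpy as np

def exact_loglik_ma1(x, theta, sigma2):
    T = len(x)
    gamma = lambda h: sigma2 * ((1 + theta**2) if h == 0 else theta if abs(h) == 1 else 0.0)
    v = np.zeros(T)              # one-step prediction error variances
    Theta = np.zeros((T, T))     # innovation coefficients theta_{n,j}
    xhat = np.zeros(T)           # one-step predictors, xhat[0] = 0
    v[0] = gamma(0)
    for n in range(1, T):
        for k in range(n):
            s = sum(Theta[k, k - j] * Theta[n, n - j] * v[j] for j in range(k))
            Theta[n, n - k] = (gamma(n - k) - s) / v[k]
        v[n] = gamma(0) - sum(Theta[n, n - j] ** 2 * v[j] for j in range(n))
        xhat[n] = sum(Theta[n, j] * (x[n - j] - xhat[n - j]) for j in range(1, n + 1))
    return -0.5 * np.sum(np.log(2 * np.pi * v) + (x - xhat) ** 2 / v)

rng = np.random.default_rng(4)
e = rng.standard_normal(200)
x = e[1:] + 0.6 * e[:-1]         # simulate an MA(1) with theta = 0.6
print(exact_loglik_ma1(x, theta=0.6, sigma2=1.0))
```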
Identification of autoregressive moving-average parameters of time series
IEEE Transactions on Automatic Control, 1975
A procedure for sequentially estimating the parameters and orders of mixed autoregressive moving-average signal models from time-series data is presented. Identification is performed by first identifying a purely autoregressive signal model. The parameters and orders of the mixed autoregressive moving-average process are then obtained by solving simple algebraic equations involving the purely autoregressive model parameters.
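In a similar two-stage spirit (though not the paper's specific algebraic equations), the sketch below first fits a long pure autoregression, uses its residuals as shock estimates, and then recovers ARMA(1,1) coefficients from a second least-squares regression; all orders and sample sizes are illustrative assumptions.

```python
# A minimal sketch in a Durbin-style two-stage spirit (not the paper's exact
# identification): a long pure-AR model is fitted first, its residuals stand in
# for the shocks, and the ARMA(1,1) parameters are recovered by a second
# regression involving only quantities from that long-AR fit.
import numpy as np

def fit_arma11_from_long_ar(y, ar_order=20):
    T = len(y)
    # Stage 1: long autoregression; residuals serve as shock estimates
    X = np.column_stack([y[ar_order - i - 1:T - i - 1] for i in range(ar_order)])
    target = y[ar_order:]
    phi_long, *_ = np.linalg.lstsq(X, target, rcond=None)
    e_hat = np.concatenate([np.zeros(ar_order), target - X @ phi_long])
    # Stage 2: regress y_t on y_{t-1} and e_hat_{t-1} to get the ARMA(1,1) parameters
    Z = np.column_stack([y[:-1], e_hat[:-1]])
    coef, *_ = np.linalg.lstsq(Z[ar_order:], y[ar_order + 1:], rcond=None)
    return coef                     # [phi_hat, theta_hat]

rng = np.random.default_rng(5)
T, phi, theta = 2000, 0.6, 0.4
e = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + e[t] + theta * e[t - 1]

print(fit_arma11_from_long_ar(y))
```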