Sequential robust estimation for nonparametric autoregressive models
Related papers
Sequential robust efficient estimation for nonparametric autoregressive models
2016
We construct efficient robust truncated sequential estimators for the pointwise estimation problem in nonparametric autoregression models with smooth coefficients. For Gaussian models we propose an adaptive procedure based on the constructed sequential estimators. The minimax nonadaptive and adaptive convergence rates are established. It turns out that in this case these rates are the same as for regression models.
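To fix ideas, here is a minimal sketch of a truncated sequential kernel estimator of this type for the model y_k = S(y_{k-1}) + xi_k: observations are processed one at a time until the accumulated kernel weight at the point of interest reaches a prescribed threshold, which controls the variance of the estimate. The rectangular kernel, the bandwidth h, the threshold H and the correction of the last weight are illustrative choices, not the authors' exact construction.

```python
import numpy as np

def kernel(u):
    """Rectangular kernel on [-1, 1] (an illustrative choice)."""
    return 1.0 * (np.abs(u) <= 1.0)

def sequential_kernel_estimate(y, x0, h, H):
    """
    Truncated sequential kernel estimate of S(x0) for the model
    y_k = S(y_{k-1}) + xi_k.  Observations are used one at a time and
    sampling stops as soon as the accumulated kernel weight at x0
    reaches the threshold H; if the threshold is not reached before the
    data are exhausted, the estimate is truncated at the last observation.
    Returns the estimate and the index at which sampling stopped.
    """
    weight_sum, num = 0.0, 0.0
    for k in range(1, len(y)):
        w = kernel((y[k - 1] - x0) / h)
        if weight_sum + w >= H:
            # Correct the last weight so that the total weight equals H exactly.
            num += (H - weight_sum) * y[k]
            return num / H, k
        weight_sum += w
        num += w * y[k]
    # Threshold never reached: fall back to the ordinary kernel ratio.
    return (num / weight_sum if weight_sum > 0 else np.nan), len(y) - 1

# Illustration on a simulated model with a hypothetical S(x) = 0.5 * tanh(x).
rng = np.random.default_rng(0)
S = lambda x: 0.5 * np.tanh(x)
y = np.zeros(2000)
for k in range(1, len(y)):
    y[k] = S(y[k - 1]) + 0.5 * rng.standard_normal()
estimate, stop_index = sequential_kernel_estimate(y, x0=0.0, h=0.2, H=50.0)
print(estimate, stop_index)
```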
Sequential Adaptive Estimators in Nonparametric Autoregressive Models
Sequential Analysis, 2011
We construct a sequential adaptive procedure for estimating the autoregressive function at a given point in nonparametric autoregression models with Gaussian noise. We make use of sequential kernel estimators. The optimal adaptive convergence rate is given, as well as an upper bound for the minimax risk.
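A standard device for making a pointwise kernel estimator adaptive to unknown smoothness is a Lepski-type comparison of estimates over a bandwidth grid. The sketch below illustrates that idea on ordinary (non-sequential) kernel estimates and is not the authors' exact sequential procedure; the noise level sigma and the constant kappa are assumed tuning inputs.

```python
import numpy as np

def nw_estimate(y, x0, h, kernel=lambda u: np.exp(-0.5 * u ** 2)):
    """Nadaraya-Watson estimate of S(x0) built from the pairs (y_{k-1}, y_k)."""
    w = kernel((y[:-1] - x0) / h)
    return np.sum(w * y[1:]) / max(np.sum(w), 1e-12)

def lepski_select(y, x0, bandwidths, sigma, kappa=2.0):
    """
    Lepski-type bandwidth choice at x0: take the largest bandwidth whose
    estimate stays within a noise-level band of every estimate built with
    a smaller bandwidth.  sigma is a (known or pre-estimated) noise level
    and kappa a tuning constant -- both are assumptions of this sketch.
    """
    bandwidths = sorted(bandwidths)                      # from small to large
    n = len(y) - 1
    est = {h: nw_estimate(y, x0, h) for h in bandwidths}
    band = {h: kappa * sigma * np.sqrt(np.log(n) / (n * h)) for h in bandwidths}
    selected = bandwidths[0]
    for i, h in enumerate(bandwidths):
        if all(abs(est[h] - est[g]) <= band[g] + band[h] for g in bandwidths[:i]):
            selected = h                                 # h is still admissible
        else:
            break
    return selected, est[selected]
```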
Adaptive efficient robust estimation for nonparametric autoregressive models
2019
In this paper the adaptive efficient estimation problem for nonparametric autoregressive models is studied for the first time. First, through the Van Trees inequality, the sharp bound for the robust quadratic risks, i.e. the Pinsker constant (see, for example, [19]), is obtained in explicit form. Then, through the sharp oracle inequalities method developed in [4] for nonparametric autoregressions, an adaptive efficient model selection procedure is proposed, i.e. one for which the upper bound of its robust quadratic risk coincides with the obtained Pinsker constant. MSC: primary 62G08, secondary 62G05
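The model selection idea can be illustrated by a generic penalized-risk rule over a family of kernel candidates: pick the candidate minimising an empirical quadratic risk plus a variance-type penalty, so that the selected estimator mimics the best one in the family up to the penalty term, which is what a sharp oracle inequality quantifies. The sketch below is only in this spirit; the candidate family and the penalty delta * sigma2 / (n * h) are illustrative assumptions, not the procedure of the paper.

```python
import numpy as np

def nw_curve(y, grid, h, kernel=lambda u: np.exp(-0.5 * u ** 2)):
    """Nadaraya-Watson estimate of the autoregression function on a grid of points."""
    w = kernel((grid[:, None] - y[None, :-1]) / h)       # weights, shape (grid, n-1)
    return (w @ y[1:]) / np.maximum(w.sum(axis=1), 1e-12)

def select_by_penalized_risk(y, bandwidths, sigma2, delta=1.0):
    """
    Pick the candidate bandwidth minimising an empirical quadratic risk
    plus a variance-type penalty that grows as the bandwidth shrinks.
    sigma2 (noise variance) and delta (penalty constant) are assumed inputs.
    """
    n = len(y) - 1
    scores = {}
    for h in bandwidths:
        fitted = nw_curve(y, y[:-1], h)                  # in-sample one-step predictions
        empirical_risk = np.mean((y[1:] - fitted) ** 2)
        penalty = delta * sigma2 / (n * h)
        scores[h] = empirical_risk + penalty
    return min(scores, key=scores.get)
```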
Adaptive estimators in nonparametric autoregressive models
2009
This paper deals with the estimation of the autoregression function at a given point in nonparametric autoregression models with Gaussian noise. An adaptive kernel estimator which attains the minimax rate with respect to the minimax risk is constructed.
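As a benchmark, the nonadaptive kernel estimate at a point with bandwidth n^(-1/(2*beta+1)) for a known Hoelder smoothness beta attains the pointwise rate n^(-beta/(2*beta+1)); an adaptive estimator mimics this choice without knowing beta. A minimal sketch of the nonadaptive benchmark, assuming beta is given:

```python
import numpy as np

def pointwise_kernel_estimate(y, x0, beta, kernel=lambda u: 1.0 * (np.abs(u) <= 1.0)):
    """
    Kernel estimate of the autoregression function at x0 with the
    bandwidth h = n^(-1/(2*beta + 1)) that balances bias and variance for
    Hoelder smoothness beta; the corresponding pointwise rate is
    n^(-beta/(2*beta + 1)).  Knowing beta is the nonadaptive assumption
    that an adaptive estimator removes.
    """
    n = len(y) - 1
    h = n ** (-1.0 / (2.0 * beta + 1.0))
    w = kernel((y[:-1] - x0) / h)
    return np.sum(w * y[1:]) / max(np.sum(w), 1e-12)
```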
Sequential Model Selection Method for Nonparametric Autoregression
2018
In this paper the nonparametric autoregression estimation problem for quadratic risks is considered for the first time. To this end we develop a new adaptive sequential model selection method based on the efficient sequential kernel estimators proposed by Arkoun and Pergamenshchikov (2016). Moreover, we develop a new analytical tool for general regression models to obtain non-asymptotic sharp oracle inequalities for both the usual quadratic and the robust quadratic risks. Then, we show that the constructed sequential model selection procedure is optimal in the sense of oracle inequalities. MSC: primary 62G08, secondary 62G05
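The "oracle" against which such a selection procedure is measured is simply the best candidate in the family for the risk at hand. The small simulation below, with a hypothetical autoregression function S(x) = sin(x) and Gaussian noise, estimates the quadratic risk of each candidate bandwidth by Monte Carlo and reports the oracle choice; a procedure satisfying a sharp oracle inequality attains a risk close to that benchmark.

```python
import numpy as np

rng = np.random.default_rng(1)
S = lambda x: np.sin(x)                    # hypothetical smooth autoregression function

def simulate(n):
    """Simulate y_k = S(y_{k-1}) + 0.3 * xi_k with standard Gaussian xi_k."""
    y = np.zeros(n)
    for k in range(1, n):
        y[k] = S(y[k - 1]) + 0.3 * rng.standard_normal()
    return y

def nw(y, x, h):
    """Nadaraya-Watson estimate of S on the points x from one trajectory y."""
    w = np.exp(-0.5 * ((x[:, None] - y[None, :-1]) / h) ** 2)
    return (w @ y[1:]) / np.maximum(w.sum(axis=1), 1e-12)

grid = np.linspace(-1.0, 1.0, 41)
bandwidths = [0.05, 0.1, 0.2, 0.4, 0.8]
risks = {}
for h in bandwidths:
    # Monte Carlo quadratic risk of the candidate with bandwidth h.
    errors = [np.mean((nw(simulate(1000), grid, h) - S(grid)) ** 2) for _ in range(20)]
    risks[h] = float(np.mean(errors))

oracle_h = min(risks, key=risks.get)       # the benchmark an oracle inequality refers to
print(risks, "oracle bandwidth:", oracle_h)
```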
A robust nonparametric estimation of the autoregression function under an ergodic hypothesis
Canadian Journal of Statistics, 2000
The authors propose a family of robust nonparametric estimators for regression or autoregression functions based on kernel methods. They show the strong uniform consistency of these estimators under a general ergodicity condition, when the data are unbounded and range over suitably increasing sequences of compact sets. They give some implications of these results for prediction in Markov processes of finite order and show, through simulation, the efficiency of the predictors they propose.
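A familiar robust kernel-type estimator in this spirit replaces the locally weighted mean by a locally weighted median, which remains stable under heavy-tailed noise; the sketch below is a generic illustration, not necessarily the exact family studied by the authors.

```python
import numpy as np

def local_median_estimate(y, x0, h, kernel=lambda u: 1.0 * (np.abs(u) <= 1.0)):
    """
    Robust kernel-type estimate of the (auto)regression function at x0:
    a kernel-weighted median of the responses y_k over the predecessors
    y_{k-1} near x0, insensitive to heavy-tailed noise and outliers.
    """
    w = kernel((y[:-1] - x0) / h)
    mask = w > 0
    if not np.any(mask):
        return np.nan
    responses, weights = y[1:][mask], w[mask]
    order = np.argsort(responses)
    responses, weights = responses[order], weights[order]
    cumulative = np.cumsum(weights)
    # Weighted median: first response whose cumulative weight reaches half the total.
    return responses[np.searchsorted(cumulative, 0.5 * cumulative[-1])]
```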
Nonparametric estimation for an autoregressive model
The paper deals with the nonparametric estimation problem at a given fixed point for an autoregressive model with noise of unknown distribution. Modifications of kernel estimates are proposed. Asymptotic minimax and efficiency properties of the proposed estimators are established.