Convergence Rates & Asymptotic Normality of Estimators

Monotone B-Spline Estimators of the Conditional Mean Function

2020

In this research, regression and smoothing spline approximations are used and compared for estimating the conditional mean function. Major attention is directed towards shape-constrained estimation. In many applications monotonicity is an integral part of the regression function g(·) being fitted. Monotonicity is obtained here free of charge, in the sense that the constrained fits inherit the asymptotic properties of the unconstrained estimates. The main tool is the use of quadratic B-splines. Simulation experiments have been undertaken to evaluate the finite-sample performance of the presented monotone ‘regression’ and ‘smoothing’ spline estimators ĝ_r* and ĝ_s. The monotone estimator ĝ_rearr*, obtained by applying the modern rearrangement technique, is used as a benchmark in various constrained (linear monotone, monotone concave and only monotone) scenarios and for different sample sizes. The resulting Mean Squared Error estimates indicate that ĝ_rearr* is the winner only wh...
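The rearrangement step used as a benchmark above is simple to sketch. The Python fragment below is only an illustration under assumptions of our own (a scipy quadratic-spline fit on a uniform design; the names g_hat and fit_rearranged are ours), not the estimators ĝ_r*, ĝ_s or ĝ_rearr* analysed in the paper:

```python
# Illustration of monotone rearrangement applied to an unconstrained
# quadratic-spline fit; not the paper's estimators.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sqrt(x) + 0.1 * rng.standard_normal(n)    # monotone truth plus noise

g_hat = UnivariateSpline(x, y, k=2, s=n * 0.01)  # unconstrained quadratic spline
grid = np.linspace(0.0, 1.0, 401)
fit = g_hat(grid)

# Rearrangement: sorting the fitted values on the grid gives a monotone
# (non-decreasing) estimate that uses exactly the same set of values.
fit_rearranged = np.sort(fit)
```

Because sorting only reorders the fitted values, the rearranged estimate is no further from a monotone truth than the unconstrained fit, which is the intuition behind using it as a benchmark.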

Convergence Rates for Series Estimators

This paper consists of part of one originally titled "Consistency and Asymptotic Normality of Nonparametric Projection Estimators." Helpful comments were provided by Andreas Buja and financial support by the NSF and the Sloan Foundation.

Estimation of the Functional Linear Regression with Smoothing Splines

Contributions to Statistics, 2008

We consider functional linear regression, where a real variable Y depends on a functional variable X. The functional coefficient of the model is estimated by means of smoothing splines. We derive rates of convergence with respect to the semi-norm induced by the covariance operator of X, which amounts to evaluating the prediction error. These rates, which essentially depend on the smoothness of the function parameter and on the structure of the predictor, are shown to be optimal over a large class of function parameters and distributions of the predictor.
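For concreteness, the model and the covariance-induced semi-norm referred to above can be written as follows; the notation is a standard formulation of ours, not necessarily the authors':

```latex
% Functional linear model and covariance-induced semi-norm (standard notation).
Y = \int_0^1 \beta(t)\,X(t)\,dt + \varepsilon,
\qquad
\Gamma f = \mathbb{E}\bigl[\langle X, f \rangle X\bigr],
\qquad
\|f\|_\Gamma^2 = \langle \Gamma f, f \rangle = \mathbb{E}\bigl[\langle X, f \rangle^2\bigr].
```

With this notation, ||β̂ − β||_Γ² is exactly the mean squared error committed when predicting ⟨X, β⟩ for a new curve X drawn from the same distribution, which is why convergence in this semi-norm measures the error of prediction.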

On Asymptotic Normality of the Local Polynomial Regression Estimator with Stochastic Bandwidths

2014

Nonparametric density and regression estimators commonly depend on a bandwidth. The asymptotic properties of these estimators have been widely studied when bandwidths are nonstochastic. In practice, however, in order to improve the finite-sample performance of these estimators, bandwidths are selected by data-driven methods, such as cross-validation or plug-in procedures. As a result, nonparametric estimators are usually constructed using stochastic bandwidths. In this paper we establish the asymptotic equivalence in probability of local polynomial regression estimators under stochastic and nonstochastic bandwidths. Our result extends previous work by Boente and Fraiman (1995) and Ziegler (2004).
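To make the setting concrete, the hedged Python sketch below combines a local linear estimator with a bandwidth chosen by leave-one-out cross-validation, i.e. a data-driven, hence stochastic, bandwidth; the Gaussian kernel and the function names are our choices, not constructions taken from the paper:

```python
# Local linear regression with a bandwidth chosen by leave-one-out
# cross-validation (a stochastic bandwidth); an illustration of the setting.
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear fit at x0 with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    sw = np.sqrt(w)
    X = np.column_stack([np.ones_like(x), x - x0])
    # Weighted least squares; the intercept is the regression estimate at x0.
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0]

def cv_bandwidth(x, y, candidates):
    """Return the candidate bandwidth minimising the leave-one-out error."""
    errors = []
    for h in candidates:
        preds = [local_linear(x[i], np.delete(x, i), np.delete(y, i), h)
                 for i in range(len(x))]
        errors.append(np.mean((y - np.array(preds)) ** 2))
    return candidates[int(np.argmin(errors))]
```

The equivalence result says, roughly, that plugging such a data-driven bandwidth into the local polynomial estimator does not change its first-order asymptotic behaviour relative to a suitable nonstochastic bandwidth.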

On the asymptotic normality of an estimate of a regression functional

J. Mach. Learn. Res., 2015

An estimate of the second moment of the regression function is introduced. Its asymptotic normality is proved, with an asymptotic variance that depends neither on the dimension of the observation vector nor on the smoothness properties of the regression function. The asymptotic variance is given explicitly.
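One simple construction with this flavour, given here purely as an illustration and not necessarily the estimator analysed in the paper, pairs each response with that of the nearest design point:

```latex
% Illustrative nearest-neighbour estimate of E[m(X)^2], m(x) = E[Y | X = x];
% not necessarily the estimator studied in the paper.
\widehat{S}_n = \frac{1}{n}\sum_{i=1}^{n} Y_i\, Y_{N(i)},
\qquad
N(i) = \operatorname*{arg\,min}_{j \neq i} \|X_j - X_i\|.
```

Given the design and independent errors, Y_i and Y_{N(i)} are conditionally independent with conditional means m(X_i) and m(X_{N(i)}) ≈ m(X_i), so E[Y_i Y_{N(i)}] ≈ E[m(X)²], without any smoothing parameter whose choice would depend on the smoothness of m.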

Smoothing splines estimators for functional linear regression

The Annals of Statistics, 2009

The paper considers functional linear regression, where scalar responses Y1, . . . , Yn are modeled as depending on random functions X1, . . . , Xn. We propose a smoothing splines estimator for the functional slope parameter based on a slight modification of the usual penalty. The theoretical analysis concentrates on the error in an out-of-sample prediction of the response for a new random function Xn+1. It is shown that the rates of convergence of the prediction error depend on the smoothness of the slope function and on the structure of the predictors. We then prove that these rates are optimal in the sense that they are minimax over large classes of possible slope functions and distributions of the predictive curves. For models with errors in variables, the smoothing spline estimator is modified by using a denoising correction of the covariance matrix of the discretized curves. The methodology is then applied to a real case study where the aim is to predict the maximum ozone concentration from the concentration curve measured the preceding day.
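As a hedged sketch of how a penalized slope estimate can be computed from discretised curves, the fragment below uses a plain second-difference roughness penalty and a Riemann-sum approximation of the inner product; the paper's modified penalty and the errors-in-variables denoising correction are not reproduced, and all names are ours:

```python
# Ridge-type approximation of a smoothing-spline slope estimate for
# functional linear regression with curves observed on a grid of p points.
import numpy as np

def flr_slope(Xmat, y, rho):
    """Xmat: (n, p) discretised curves, y: (n,) responses, rho: penalty weight."""
    n, p = Xmat.shape
    D2 = np.diff(np.eye(p), n=2, axis=0)     # second-difference operator
    P = D2.T @ D2                            # roughness penalty matrix
    G = Xmat.T @ Xmat / (n * p ** 2)         # Riemann approx. of the covariance
    b = np.linalg.solve(G + rho * P, Xmat.T @ y / (n * p))
    return b                                 # slope values on the grid

# Out-of-sample prediction for a new discretised curve x_new:
# y_hat = x_new @ b / p
```

Choosing rho by cross-validation mirrors the bias-variance trade-off that drives the convergence rates discussed above.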

Strong uniform consistency and asymptotic normality of a kernel based error density estimator in functional autoregressive models

Statistical Inference for Stochastic Processes, 2012

Estimating the innovation probability density is an important issue in any regression analysis. This paper focuses on functional autoregressive models. A residual-based kernel estimator is proposed for the innovation density. The asymptotic properties of this estimator depend on the average prediction error of the functional autoregressive function. Sufficient conditions are given under which the kernel density estimator is strongly uniformly consistent and asymptotically normal.
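A minimal sketch of the residual-based construction, assuming real-valued innovations and that residuals from the functional autoregressive fit are already available (the fit itself and the paper's conditions are not reproduced; the Gaussian kernel and all names are ours):

```python
# Kernel density estimate of the innovation density built from residuals
# eps_hat of a fitted functional autoregressive model (fit not shown).
import numpy as np

def residual_kde(eps_hat, h):
    """Return a function t -> (1/(n*h)) * sum_i K((t - eps_i)/h) with Gaussian K."""
    eps_hat = np.asarray(eps_hat, dtype=float)
    n = eps_hat.size

    def f_hat(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        u = (t[None, :] - eps_hat[:, None]) / h             # shape (n, len(t))
        kernel = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)
        return kernel.sum(axis=0) / (n * h)

    return f_hat
```

The quality of such an estimator is tied to how well the residuals approximate the true innovations, which is why its asymptotic properties depend on the average prediction error of the fitted autoregressive function.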