Asymptotics for the least trimmed squares estimator
Related papers
New algorithms for computing the least trimmed squares estimator
2022
Instead of minimizing the sum of all n squared residuals as the classical least squares (LS) does, Rousseeuw (1984) proposed to minimize the sum of the h (n/2 ≤ h < n) smallest squared residuals; the resulting estimator is called the least trimmed squares (LTS) estimator. The idea of the LTS is simple, but its computation is challenging since no LS-type analytical formula exists any longer. Attempts have been made since its introduction: the feasible solution algorithm (Hawkins (1994)), fastlts.f (Rousseeuw and Van Driessen (1999)), and FAST-LTS (Rousseeuw and Van Driessen (2006)), among others, are promising approximate algorithms. The latter two have been incorporated into the R function ltsReg by Valentin Todorov. These algorithms utilize combinatorial or subsampling approaches. With its broad software accessibility and fast speed, the LTS, which enjoys many desirable properties, has become one of the most popular robust regression estimators across multiple disciplines. This article proposes analytic approaches employing the first-order derivative (gradient) and the second-order derivative (Hessian matrix) of the objective function. Our approximate algorithms for the LTS are vetted in synthetic and real data examples. Compared with ltsReg, the benchmark in robust regression and well known for its speed, our algorithms are comparable (and sometimes even favorable) with respect to both speed and accuracy criteria. Other major contributions include (i) establishing the uniqueness of the estimator and its strong consistency and Fisher consistency in the empirical and population settings, respectively; (ii) deriving the influence function in a general setting; (iii) re-establishing the asymptotic normality (and consequently root-n consistency) of the estimator with a neat and general approach.
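To make the objective concrete, here is a minimal sketch (not the paper's gradient/Hessian method): the LTS objective sums the h smallest squared residuals, and a single concentration (C-) step in the spirit of FAST-LTS refits ordinary LS on the h best-fitting observations, which provably never increases the objective. All function names and the toy data below are illustrative assumptions.

```python
import numpy as np

def lts_objective(beta, X, y, h):
    """Sum of the h smallest squared residuals at candidate coefficients beta."""
    r2 = (y - X @ beta) ** 2
    return np.sort(r2)[:h].sum()

def c_step(beta, X, y, h):
    """One concentration step: refit ordinary LS on the h observations
    with the smallest squared residuals under the current fit."""
    r2 = (y - X @ beta) ** 2
    idx = np.argsort(r2)[:h]
    beta_new, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    return beta_new

# Tiny demo: a clean line y = 1 + 2x with a few gross outliers.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.1, size=50)
y[:5] += 10.0                     # contaminate 5 responses
h = 40
beta = np.zeros(2)
for _ in range(10):               # iterate C-steps from a (poor) start
    beta = c_step(beta, X, y, h)
```

In practice FAST-LTS runs such C-steps from many random elemental starts and keeps the best local minimum; a single start, as above, is only a sketch.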
On the Least Trimmed Squares Estimator
Algorithmica, 2014
David M. Mount, Nathan S. Netanyahu, Christine D. Piatko, Ruth Silverman, Angela Y. Wu. November 27, 2007. Abstract: The linear least trimmed squares (LTS) estimator ...
The multivariate least trimmed squares estimator
DTEW Research Report 0224, 2002
In this paper we introduce the least trimmed squares estimator for multivariate regression. We give three equivalent formulations of the estimator and obtain its breakdown point. A fast algorithm for its computation is proposed. We prove Fisher consistency at the multivariate regression model with elliptically symmetric error distribution and derive the influence function. Simulations investigate the finite-sample efficiency and robustness of the estimator. To increase the efficiency of the estimator, we also consider a one-step reweighted version, as well as multivariate generalizations of one-step GM-estimators.
Combining Some Biased Estimation Methods with Least Trimmed Squares Regression and its Application
Revista Colombiana de Estadística, 2015
In the case of multicollinearity and outliers in regression analysis, the researchers are encouraged to deal with two problems simultaneously. Biased methods based on robust estimators are useful for estimating the regression coefficients for such cases. In this study we examine some robust biased estimators on the datasets with outliers in x direction and outliers in both x and y direction from literature by means of the R package ltsbase. Instead of a complete data analysis, robust biased estimators are evaluated using capabilities and features of this package.
Asymptotic normality of the least sum of squares of trimmed residuals estimator
2022
To enhance the robustness of the classical least sum of squares (LS) of residuals estimator, Zuo (2022) introduced the least sum of squares of trimmed (LST) residuals estimator. The LST enjoys many desirable properties and serves well as a robust alternative to the LS. Its asymptotic properties, including strong and root-n consistency, have been established, whereas its asymptotic normality was left unaddressed. This article resolves this remaining problem.
Least sum of squares of trimmed residuals regression
2022
In the famous least sum of trimmed squares (LTS) of residuals estimator (Rousseeuw (1984)), residuals are first squared and then trimmed. In this article, we first trim residuals using a depth trimming scheme and then square the remaining residuals. The estimator that minimizes the sum of squares of the trimmed residuals is called the LST estimator. It turns out that the LST is also a robust alternative to the classic least sum of squares (LS) of residuals estimator. Indeed, it has a very high finite-sample breakdown point and can resist, asymptotically, up to 50% contamination without breakdown, in sharp contrast to the 0% of the LS estimator. The population version of the LST is Fisher consistent, and the sample version is strongly and root-n consistent under some conditions. Three approximate algorithms for computing the LST are proposed and tested in synthetic and real data examples. These experiments indicate that two of the algorithms can compute the LST estimator very fast and with relatively...
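The trim-then-square idea can be sketched as follows. Note that the trimming rule used here (standardized distance of each residual from the residual median, scaled by the MAD) is only a simple stand-in for the paper's depth trimming scheme, whose exact form is not reproduced in this listing; the function name and cutoff are illustrative assumptions.

```python
import numpy as np

def lst_objective_sketch(beta, X, y, alpha=2.5):
    """Trim first, then square: keep residuals within alpha robust-scale
    units of the residual median (a stand-in trimming rule, not the
    paper's actual depth trimming scheme), then sum their squares."""
    r = y - X @ beta
    med = np.median(r)
    mad = np.median(np.abs(r - med))
    scale = mad if mad > 0 else 1.0   # guard against a zero scale
    keep = np.abs(r - med) / scale <= alpha
    return np.sum(r[keep] ** 2)
```

Because a gross outlier is excluded before squaring, its huge squared residual never enters the objective, which is the source of the estimator's robustness.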
General Trimmed Estimation: Robust Approach to Nonlinear and Limited Dependent Variable Models
Econometric Theory, 2008
High breakdown-point regression estimators protect against large errors and data contamination. Motivated by some of them, namely the least trimmed squares and maximum trimmed likelihood estimators, we propose a general trimmed estimator, which unifies and extends many existing robust procedures. We derive the consistency and rate of convergence of the proposed general trimmed estimator under mild β-mixing conditions and demonstrate its applicability in nonlinear regression, time series, limited dependent variable models, and panel data.
Adaptive trimmed likelihood estimation in regression
Discussiones Mathematicae Probability and Statistics, 2010
In this paper we derive an asymptotic normality result for an adaptive trimmed likelihood estimator of regression, starting from initial high breakdown-point robust regression estimates. The approach leads to quickly and easily computed robust and efficient estimates for regression. A highlight of the method is that, in a single algorithm, it tends automatically to expose the outliers and to give least squares estimates with the outliers removed. The idea is to begin with a rapidly computed consistent robust estimator such as the least median of squares (LMS), least trimmed squares (LTS), or, for example, the more recent MM-estimators of Yohai. Such estimators are now standard in statistical computing packages, for example S-PLUS or R. In addition to the asymptotics, we provide data analyses supporting the new adaptive approach. This approach appears to work well on a number of data sets and is quicker than the related brute-force adaptive regression approach described in Clarke. This current approach builds on the work of Bednarski and Clarke (2002), which considered the asymptotics for the location estimator only.
The Comparing of S-estimator and M-estimators in Linear Regression
Gazi University Journal of Science, 2011
In the presence of outliers, least squares estimation is inefficient and can be biased. In the 1980s, several alternatives to M-estimation were proposed as attempts to overcome this lack of resistance. Least Trimmed Squares (LTS) is a viable alternative and is presently the preferred choice of Ryan (1997, 2008). Another proposed solution was S-estimation, which finds a line that minimizes a robust estimate of the scale of the residuals. This method is highly resistant to leverage points and is robust to outliers in the response. However, it was also found to be inefficient.
2020
Robust regression is a regression method used when the distribution of the residuals is not well behaved or when outliers in the observational data affect the model. One method for estimating regression parameters is the Least Squares Method (MKT), which is easily affected by the presence of outliers. An alternative method that is robust to outliers is therefore needed, namely robust regression. Methods for estimating robust regression parameters include Least Trimmed Squares (LTS) and Least Median of Squares (LMS). These methods are estimators with high breakdown points against outlying observations and have more efficient algorithms than other estimation methods. This study aims to compare the regression models formed by the LTS and LMS methods, determine the efficiency of the models formed, and determine the factors that influence smallholder oil palm production in Langkat District in 2018. The results showed that in testing, the estimated model of the...