Adaptive Robust Regression by Using a Nonlinear Regression Program
Related papers
Robust regression and applications
1996
This study analyzes the effect of outliers in regression analysis with the help of a program written in the GAUSS programming language. The analysis relies on the subject of robust regression, which is explained and supported by experiments and applications. The applications contain examples that show the superiority of this technique.
Statistics in Transition New Series
Zaman and Bulut (2018a) developed a class of estimators for a population mean utilising LMS robust regression and supplementary attributes. In this paper, a family of estimators is proposed, based on the adaptation of the estimators presented by Zaman (2019), followed by the introduction of a new family of regression-type estimators utilising robust regression tools (LAD, H-M, LMS, H-MM, Hampel-M, Tukey-M, LTS) and supplementary attributes. The mean square error expressions of the adapted and proposed families are determined through a general formula. The study demonstrates that the adapted class of the Zaman (2019) estimators is in every case more efficient than that of Zaman and Bulut (2018a). In addition, the proposed robust regression estimators based on robust regression tools and supplementary attributes are more efficient than those of Zaman and Bulut (2018a) and Zaman (2019). The theoretical findings are supported by real-life examples.
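To make the structure of such regression-type estimators concrete, the following is a minimal sketch in Python of a mean estimator of the form ȳ + b(P − p), where the slope b on a binary auxiliary attribute is estimated either by OLS or by LAD. The data and setup are assumed for illustration only; this is not the exact estimator family of Zaman and Bulut (2018a) or Zaman (2019).

```python
import numpy as np

# Minimal sketch of a regression-type mean estimator ybar + b * (P - p) that
# uses an auxiliary 0/1 attribute; data and setup are assumed for illustration
# and are not the estimator families of Zaman and Bulut (2018a) or Zaman (2019).

rng = np.random.default_rng(0)

# Finite population: y is the study variable, x is a 0/1 auxiliary attribute.
N = 5000
x_pop = rng.binomial(1, 0.4, size=N)
y_pop = 10 + 5 * x_pop + rng.normal(0, 2, size=N)
P = x_pop.mean()                        # known population proportion of the attribute

# Simple random sample, with a few gross outliers in y.
n = 100
idx = rng.choice(N, size=n, replace=False)
y, x = y_pop[idx].copy(), x_pop[idx]
y[:3] += 60

y_bar, p = y.mean(), x.mean()

# OLS slope of y on the attribute, distorted by the outliers.
b_ols = np.cov(y, x, ddof=1)[0, 1] / np.var(x, ddof=1)

# LAD (L1) slope: with a binary regressor it reduces to a difference of
# within-group medians, so it resists the outliers.
b_lad = np.median(y[x == 1]) - np.median(y[x == 0])

# Regression-type estimators of the population mean of y.
t_ols = y_bar + b_ols * (P - p)
t_lad = y_bar + b_lad * (P - p)

print(f"true mean            : {y_pop.mean():.3f}")
print(f"slopes (OLS, LAD)    : {b_ols:.3f}, {b_lad:.3f}")
print(f"reg. estimator (OLS) : {t_ols:.3f}")
print(f"reg. estimator (LAD) : {t_lad:.3f}")
```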
A review of some recent developments in robust regression
British Journal of Mathematical and Statistical Psychology, 1996
In situations where the goal is to understand how a random variable y is related to a set of p predictor variables, modern robust regression methods can be invaluable. One reason is that even one unusual value in the design space, or one outlier among the y values, can have a large impact on the ordinary least squares estimate of the parameters of the usual linear model. That is, a single unusual value or outlier can give a highly distorted view of how two or more random variables are related.
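The following is a minimal sketch, with simulated data, of this distortion: a single gross outlier in y pulls the OLS fit, while a Huber M-estimate computed by iteratively reweighted least squares (a generic textbook implementation assumed here for illustration, not any particular paper's method) stays close to the bulk of the data.

```python
import numpy as np

# Minimal sketch with simulated data: one gross outlier pulls the OLS fit,
# while a Huber M-estimate computed by iteratively reweighted least squares
# (a generic implementation, assumed for illustration) stays close to the
# bulk of the data.

rng = np.random.default_rng(1)

n = 50
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 0.5, n)
y[0] = 40.0                               # single gross outlier in y

X = np.column_stack([np.ones(n), x])      # design matrix with intercept

# Ordinary least squares.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

def huber_irls(X, y, c=1.345, n_iter=50):
    """Huber M-estimation via iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ beta
        scale = np.median(np.abs(r - np.median(r))) / 0.6745   # MAD scale
        u = np.abs(r) / max(scale, 1e-8)
        w = np.minimum(1.0, c / np.maximum(u, 1e-12))          # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta

beta_huber = huber_irls(X, y)

print("true  intercept, slope: [2.0, 0.5]")
print("OLS   intercept, slope:", np.round(beta_ols, 3))
print("Huber intercept, slope:", np.round(beta_huber, 3))
```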
Comparison of Robust Regression Methods in Linear Regression
Int. J. Contemp. Math. Sciences, 2011
In classical multiple regression, ordinary least squares estimation is the best method for obtaining regression weights, provided its assumptions are met. However, if the data do not satisfy some of these assumptions, then sample estimates and results can be misleading. In particular, outliers violate the assumption of normally distributed residuals in least squares regression. The danger that outlying observations, in the direction of either the dependent or the explanatory variables, pose to least squares regression is that they can have a strong adverse effect on the estimates while remaining unnoticed. Therefore, statistical techniques that are able to cope with or to detect outlying observations have been developed. Robust regression is an important method for analyzing data that are contaminated with outliers. It can be used to detect outliers and to provide resistant results in their presence. The purpose of this study is to describe the behavior of outliers in linear regression and to compare several robust regression methods via a simulation study. The simulation study is used to determine which methods perform best across the linear regression scenarios considered.
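As a rough illustration of this kind of simulation comparison (the scenarios and estimators below are assumptions for illustration, not the study's actual design), the following sketch compares the slope mean square error of OLS, Huber M, and Theil-Sen estimators under clean data, y-outliers, and leverage points.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor, TheilSenRegressor

# Small Monte Carlo sketch (assumed scenarios, not the study's design)
# comparing the slope MSE of OLS, Huber M, and Theil-Sen estimators under
# clean data, y-outliers, and x-outliers (leverage points).

rng = np.random.default_rng(2)
true_slope, n, reps = 1.0, 60, 200

def make_data(scenario):
    x = rng.normal(0, 1, n)
    y = true_slope * x + rng.normal(0, 1, n)
    if scenario == "y-outliers":
        y[:6] += 15                      # 10% gross errors in y
    elif scenario == "x-outliers":
        x[:6] += 10                      # 10% leverage points
        y[:6] = rng.normal(0, 1, 6)      # leverage points off the true line
    return x.reshape(-1, 1), y

estimators = {
    "OLS":       LinearRegression(),
    "Huber-M":   HuberRegressor(),
    "Theil-Sen": TheilSenRegressor(random_state=0),
}

for scenario in ["clean", "y-outliers", "x-outliers"]:
    errors = {name: [] for name in estimators}
    for _ in range(reps):
        X, y = make_data(scenario)
        for name, est in estimators.items():
            est.fit(X, y)
            errors[name].append((est.coef_[0] - true_slope) ** 2)
    mse = {name: round(float(np.mean(e)), 4) for name, e in errors.items()}
    print(scenario, mse)
```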
Partially adaptive robust estimation of regression models and applications
European Journal of Operational Research, 2006
This paper provides an accessible exposition of recently developed partially adaptive estimation methods and their application. These methods are robust to thick-tailed or asymmetric error distributions and should be of interest to researchers and practitioners in data mining, agent learning, and mathematical modeling in a wide range of disciplines. In particular, partially adaptive estimation methods can serve as robust alternatives to ordinary regression analysis, as well as machine learning methods developed by the artificial intelligence and computing communities. Results from analysis of three problem domains demonstrate application of the theory.
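One common partially adaptive scheme, sketched below on assumed simulated data (and not necessarily the estimators used in this paper), is maximum-likelihood regression with Student-t errors in which the scale and degrees of freedom are estimated jointly with the coefficients, so the fit adapts to thick-tailed residuals.

```python
import numpy as np
from scipy import optimize, stats

# Minimal sketch of one common partially adaptive scheme (assumed for
# illustration, not necessarily the paper's estimators): maximum-likelihood
# regression with Student-t errors, where scale and degrees of freedom are
# estimated from the data along with the regression coefficients.

rng = np.random.default_rng(3)
n = 200
x = rng.normal(0, 1, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.standard_t(df=2.5, size=n)   # thick-tailed errors

def neg_loglik(params):
    beta, log_scale, log_df = params[:2], params[2], params[3]
    resid = y - X @ beta
    # negative log-likelihood of scaled Student-t residuals
    return -np.sum(stats.t.logpdf(resid, df=np.exp(log_df), scale=np.exp(log_scale)))

start = np.r_[np.linalg.lstsq(X, y, rcond=None)[0], 0.0, np.log(5.0)]
res = optimize.minimize(neg_loglik, start, method="Nelder-Mead",
                        options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})

beta_ml, df_hat = res.x[:2], np.exp(res.x[3])
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print("OLS coefficients          :", np.round(beta_ols, 3))
print("Partially adaptive (t) ML :", np.round(beta_ml, 3), " df approx.", round(df_hat, 2))
```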
A robust regression technique using compound estimation
Naval Research Logistics, 1998
Least squares fitting of regression models is a widely used technique. The presence of outliers in the data can have an adverse effect on the method of least squares, resulting in a model that does not adequately fit the bulk of the data. For this situation, robust regression techniques have been proposed as an improvement to the method of least squares. We propose a robust regression procedure that performs well relative to current robust methods across a variety of dataset types. Evaluations are performed using datasets without outliers (testing efficiency), with a large percentage of outliers (testing breakdown), and with high-leverage outliers (testing bounded influence). The datasets are based on 2-level factorial designs that include axial points to evaluate leverage effects. A Monte Carlo simulation approach is used to evaluate the estimating capability of the proposed procedure relative to several competing methods. We also provide an application to estimating costs for government satellites.
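The role of axial points as high-leverage runs can be illustrated with a small sketch (a two-factor design with an assumed axial distance of 2, not the paper's actual designs): the leverage of each run is the corresponding diagonal element of the hat matrix.

```python
import numpy as np
from itertools import product

# Minimal sketch (two factors, assumed axial distance of 2, not the designs
# used in the paper): a 2-level factorial augmented with axial points, and
# the leverage of each run, i.e. the diagonal of the hat matrix for a
# first-order model.

alpha = 2.0
factorial = np.array(list(product([-1.0, 1.0], repeat=2)))
axial = np.array([[-alpha, 0.0], [alpha, 0.0], [0.0, -alpha], [0.0, alpha]])
design = np.vstack([factorial, axial])

X = np.column_stack([np.ones(len(design)), design])   # intercept + two factors
H = X @ np.linalg.inv(X.T @ X) @ X.T                   # hat matrix
leverage = np.diag(H)

for run, h in zip(design, leverage):
    print(f"run {run}  leverage {h:.3f}")
# With this axial distance the axial runs carry the higher leverage, so placing
# an outlier at an axial point is the natural stress test for bounded influence.
```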