On the Efficiency of Modified Regression-type Estimators Using Robust Regression Slopes and Non-conventional Measures of Dispersion
Related papers
Modified ratio estimators using robust regression methods, 2018
When there is an outlier in the data set, the efficiency of traditional methods decreases. To address this problem, Kadilar et al. (2007) adapted the Huber-M method, one of the robust regression methods, to ratio-type estimators and reduced the effect of outliers. In this study, new ratio-type estimators are proposed by applying the Tukey-M, Hampel-M, Huber-MM, LTS, LMS, and LAD robust methods within the framework of Kadilar et al. (2007). We derive the mean square error (MSE) of these estimators theoretically and compare the MSE values of the proposed estimators with those of the estimators based on the Huber-M and OLS methods. These comparisons show that the proposed estimators are more efficient than both the Huber-M approach of Kadilar et al. (2007) and the OLS approach. Moreover, under all conditions, all of the proposed estimators except the LAD method are more efficient than the robust estimators proposed by Kadilar et al. (2007). These theoretical results are supported by a numerical example and a simulation based on data that include an outlier.
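Several of these abstracts refer to ratio-type estimators in which a robust regression slope replaces the OLS one. As an illustration only (not the authors' code: the function names, the IRLS Huber fit, and the estimator form (ȳ + b(X̄ − x̄))·X̄/x̄ with known population mean X̄ are assumptions based on the standard Kadilar–Cingi form), a minimal Python sketch:

```python
import numpy as np

def huber_slope(x, y, c=1.345, iters=50):
    """Huber-M slope via iteratively reweighted least squares (IRLS).
    Illustrative sketch; production code would use a library implementation."""
    b0, b1 = np.polyfit(x, y, 1)[::-1]          # OLS start: intercept, slope
    for _ in range(iters):
        r = y - b0 - b1 * x
        s = 1.4826 * np.median(np.abs(r - np.median(r)))  # MAD scale
        if s == 0:                               # perfect fit: nothing to downweight
            break
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)         # Huber weights
        A = np.column_stack([np.ones_like(x), x])
        W = np.diag(w)
        b0, b1 = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return b1

def ratio_type_estimate(x_sample, y_sample, X_bar, slope_fn=huber_slope):
    """Ratio-type estimator of the population mean of y (assumed Kadilar–Cingi
    form), with the regression slope supplied by a possibly robust slope_fn."""
    xb, yb = x_sample.mean(), y_sample.mean()
    b = slope_fn(x_sample, y_sample)
    return (yb + b * (X_bar - xb)) * X_bar / xb
```

Swapping `slope_fn` for a Tukey-M, LTS, or LAD fit is what distinguishes the variants the abstract compares.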
Mathematical Problems in Engineering
Many authors have defined modified versions of the mean estimator using two auxiliary variables. These proposed estimators depend heavily on the calculated regression coefficients and do not give satisfactory results in the presence of outliers. In this study, we improve the suggested estimators by using several robust regression techniques to obtain the regression coefficients. We compared the efficiencies of the suggested estimators and the estimators presented in the literature, and we used two numerical examples and a simulation study to support the theoretical results. Empirical results show that the modified ratio estimator performs well in the presence of outliers when robust regression techniques are adopted.
2020
Robust regression is a regression method used when the distribution of the residuals is not well behaved or when outliers in the observational data affect the model. A common method for estimating regression parameters is ordinary least squares, which is easily affected by the presence of outliers. An alternative method that is resistant to outliers is therefore needed, namely robust regression. Methods for estimating robust regression parameters include Least Trimmed Squares (LTS) and Least Median of Squares (LMS). These estimators have high breakdown points for data containing outliers and more efficient algorithms than other estimation methods. This study aims to compare the regression models produced by the LTS and LMS methods, determine the efficiency of the resulting models, and identify the factors that influence smallholder oil palm production in Langkat District in 2018. The results showed that, in testing, the estimated model of the...
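The LTS and LMS criteria above can be approximated, for small samples, by an exhaustive search over lines through pairs of data points. A minimal sketch (an illustration under stated assumptions, not the PROGRESS or FAST-LTS algorithms used in practice; all function names are invented):

```python
import numpy as np
from itertools import combinations

def elemental_fit(x, y, criterion):
    """Try every line through two data points; keep the one minimising
    criterion(squared residuals). A crude but serviceable approximation."""
    best, best_val = None, np.inf
    for i, j in combinations(range(len(x)), 2):
        if x[i] == x[j]:
            continue                              # vertical line: skip
        b1 = (y[j] - y[i]) / (x[j] - x[i])
        b0 = y[i] - b1 * x[i]
        val = criterion((y - b0 - b1 * x) ** 2)
        if val < best_val:
            best, best_val = (b0, b1), val
    return best

def lms_line(x, y):
    """Least Median of Squares: minimise the median squared residual."""
    return elemental_fit(x, y, np.median)

def lts_line(x, y, h=None):
    """Least Trimmed Squares: minimise the sum of the h smallest squared
    residuals (h defaults to roughly half the sample)."""
    h = h or (len(x) // 2 + 1)
    return elemental_fit(x, y, lambda r2: np.sort(r2)[:h].sum())
```

Because both criteria ignore the largest residuals, a single gross outlier leaves the fitted line on the clean points, which is the high-breakdown behaviour the abstract describes.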
Leverages, Outliers and the Performance of Robust Regression Estimators
British Journal of Mathematics & Computer Science, 2016
In this study, we assess the performance of some robust regression methods: the least trimmed squares estimator (LTSE), Huber maximum likelihood estimator (HME), S-estimator (SE), and modified maximum likelihood estimator (MME), which are compared with the ordinary least squares estimator (OLSE) at different levels of leverage in the predictor variables. Anthropometric data from Komfo Anokye Teaching Hospital (KATH) were used, and the comparison is based on root mean square error (RMSE), relative efficiency (RE), coefficient of determination (R-squared), and power of the test. The results show that the robust methods are as efficient as the OLSE when the assumptions of OLSE are met. The OLSE is affected by both low and high percentages of leverage, and the HME breaks down in the presence of leverage points. The MME and SE are robust at all percentages of aberration, while the LTSE is slightly affected by a high percentage of leverage perturbation. Thus, the MME and SE are the most robust methods, the OLSE and HME are the least robust, and the performance of the LTSE is affected by higher percentages of leverage in this study.
Improved regression in ratio type estimators based on robust M-estimation
PLOS ONE
In this article, a new robust ratio-type estimator using Uk's redescending M-estimator is proposed for estimating the finite population mean under simple random sampling (SRS) when there are outliers in the dataset. The mean square error (MSE) equation of the proposed estimator is obtained using a first-order approximation, and the estimator is compared with the traditional ratio-type estimators in the literature, robust regression estimators, and other existing redescending M-estimators. A real-life dataset and a simulation study are used to assess the efficiency of the proposed estimators. The proposed estimator is shown to be more efficient than the other estimators in the literature in both the simulation and the real-data study.
This study compared the performance of some robust regression methods and the ordinary least squares estimator (OLSE). The estimators were compared under varied levels of leverage and vertical outliers in the predictors and the dependent variable. An anthropometric dataset on total body fat was used, with height, Body Mass Index (BMI), triceps skinfold (TS), and arm fat as a percentage of body composition (parmfat) as the predictors. The effects of outliers and leverage points on the estimators were investigated at 5% leverage with 10% vertical outliers and at 5% leverage with 15% vertical outliers. The criteria for the comparison were the coefficients, root mean square error (RMSE), relative efficiency (RE), coefficient of determination (R-squared), and power of the test. The findings revealed that the OLSE was affected by both outliers and leverage points, while the Huber maximum likelihood estimator (HME) was affected by leverage points. The least trimmed squares estimator (LTSE) was slightly affected by high perturbations of outliers and leverage points.
Comparison of Regression Estimator and Ratio Estimator: A Simulation Study
2017
We compared ratio and regression estimators empirically based on bias and coefficient of variation. Simulation studies accounting for sampling rate, population size, heterogeneity of the auxiliary variable x, deviation from linearity, and model misspecification were conducted. The study shows that the ratio estimator is better than the regression estimator when the regression line passes close to the origin. Ratio and regression estimators still work even when there is only a weak linear relationship between x and y, provided that model misspecification is minimal, if not absent. When the relationship between the target variable and the auxiliary variable is very weak, bootstrap estimates yield lower bias. The regression estimator is generally more efficient than the ratio estimator.
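The two classical estimators compared above take a simple form. A minimal sketch (illustrative only; the function names are assumptions) showing why the ratio estimator ȳ·X̄/x̄ is exact for a through-origin line, while the regression estimator ȳ + b(X̄ − x̄) also handles an intercept:

```python
import numpy as np

def ratio_estimate(x, y, X_bar):
    """Classical ratio estimator of the population mean of y,
    using the known population mean X_bar of the auxiliary variable."""
    return y.mean() * X_bar / x.mean()

def regression_estimate(x, y, X_bar):
    """Classical regression estimator: adjust the sample mean of y by the
    OLS slope times the gap between population and sample means of x."""
    b = np.polyfit(x, y, 1)[0]                   # OLS slope
    return y.mean() + b * (X_bar - x.mean())
```

When the true line has a nonzero intercept, the ratio estimator is biased while the regression estimator remains exact, consistent with the abstract's finding that the regression estimator is generally more efficient.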
British Journal of Mathematics & Computer Science, 2016
The ordinary least squares estimator (OLSE) is the best method for estimating linear regression coefficients when the classical assumptions are satisfied. When these assumptions are violated, robust methods give more reliable estimates, while the OLSE is strongly and adversely affected. To assess the sensitivity of several estimators using more than five criteria, a secondary dataset of anthropometric measurements from Komfo Anokye Teaching Hospital, Kumasi, Ghana, is used. In this study, we compare the performance of the Huber maximum likelihood estimator (HMLE), least trimmed squares estimator (LTSE), S-estimator (SE), and modified maximum likelihood estimator (MMLE) relative to the OLSE when the dataset has normal errors; 10, 20, and 30 percent outliers; 20% error contamination; and lognormal contamination in the response.
Regression Estimation in the Presence of Outliers: A Comparative Study
2016
In linear models, the ordinary least squares (OLS) estimators of the parameters have long been regarded as the best linear unbiased estimators. However, if the data contain outliers, the least-squares estimates may be distorted. An alternative approach, the so-called robust regression methods, is therefore needed to obtain a better fit of the model or more precise parameter estimates. In this article, various robust regression methods are reviewed, with a focus on outliers in the y-direction (response direction). The properties of these methods are compared through a simulation study, using efficiency and breakdown point as the comparison criteria. The methods are also applied to a real data set.
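The sensitivity of OLS to y-direction outliers, and the resistance of Least Absolute Deviation (one of the methods these reviews cover), can be demonstrated in a few lines. A minimal sketch (illustrative only; the exhaustive pair search relies on the property that an optimal L1 line passes through at least two data points, and all names are invented):

```python
import numpy as np
from itertools import combinations

def ols_slope(x, y):
    """Ordinary least squares slope."""
    return np.polyfit(x, y, 1)[0]

def lad_line(x, y):
    """Least Absolute Deviation fit by exhaustive search: an optimal L1 line
    passes through at least two data points, so trying all pairs is exact."""
    best, best_val = None, np.inf
    for i, j in combinations(range(len(x)), 2):
        if x[i] == x[j]:
            continue                              # vertical line: skip
        b1 = (y[j] - y[i]) / (x[j] - x[i])
        b0 = y[i] - b1 * x[i]
        val = np.abs(y - b0 - b1 * x).sum()       # L1 criterion
        if val < best_val:
            best, best_val = (b0, b1), val
    return best
```

On data lying on y = 1 + 2x with one response shifted by +30, the OLS slope is pulled far from 2 while the LAD slope stays at 2, which is the qualitative contrast the simulation studies above quantify.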
REVIEW OF SOME ROBUST ESTIMATORS IN MULTIPLE LINEAR REGRESSIONS IN THE PRESENCE OF OUTLIER(S)
African Journal of Mathematics and Statistics Studies , 2023
Linear regression has been one of the most important statistical data analysis tools. Multiple regression is a type of regression in which the dependent variable shows a linear relationship with two or more independent variables. The OLS estimate is extremely sensitive to unusual observations (outliers), with a low breakdown point and low efficiency. This paper reviews and compares some existing robust methods (Least Absolute Deviation, Huber M-estimator, Bisquare M-estimator, MM-estimator, Least Median of Squares, Least Trimmed Squares, S-estimator); a simulation study is used to compare the selected methods. Based on the results, it was concluded that for y-direction outliers, the best estimator in terms of high efficiency, with a breakdown point of at most 0.3, is the MM-estimator; for x-direction outliers, the best estimator in terms of breakdown point, at most 0.4, is the S-estimator; and for outliers in both the x and y directions, the best estimator in terms of high efficiency, with a breakdown point of at most 0.2, is the MM-estimator.