Study of Some Kinds of Ridge Regression Estimators in Linear Regression Model
Related papers
A New Ridge-Type Estimator for the Linear Regression Model: Simulations and Applications
Scientifica
The ridge-type (Hoerl and Kennard, 1970) and Liu-type (Liu, 1993) estimators are consistently attractive shrinkage methods for reducing the effects of multicollinearity in both linear and nonlinear regression models. This paper proposes a new estimator to address the multicollinearity problem in the linear regression model. Theory and simulation results show that, under some conditions, it performs better than both the Liu and ridge regression estimators in the smaller-MSE sense. Two real-life data sets (chemical and economic) are analyzed to illustrate the findings of the paper.
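The ridge estimator these papers build on replaces the OLS normal equations (X'X)b = X'y with the shrunken system (X'X + kI)b = X'y, where k = 0 recovers OLS. A minimal numpy sketch under that definition (the data, the collinearity level, and the value k = 1 are illustrative assumptions, not taken from any of the papers):

```python
import numpy as np

def ridge_estimator(X, y, k):
    """Ridge estimate b(k) = (X'X + kI)^{-1} X'y; k = 0 recovers OLS."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# Hypothetical data with two nearly collinear predictors.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)   # strong collinearity
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(size=100)

beta_ols = ridge_estimator(X, y, k=0.0)      # unstable under collinearity
beta_ridge = ridge_estimator(X, y, k=1.0)    # shrunk toward zero, more stable
```

The shrinkage is visible in the coefficient norm: the ridge solution's norm decreases monotonically as k grows, which is what stabilizes the estimate when X'X is nearly singular.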
Ridge regression estimators with the problem of multicollinearity
Applied mathematical sciences, 2013
The study aims to illustrate the negative effect of the multicollinearity problem on the sample, to present ridge regression as a method for dealing with multicollinearity, to focus on some ridge regression estimators (James and Stein, Bhattacharya, and the heuristic estimator), and to identify which of these estimators is preferable for estimating the parameters of a model that suffers from multicollinearity. The minimum mean squared error (MSE) was used as the criterion for the best estimator. The method was applied to data on a bank's return on total assets, after verifying that these data exhibit multicollinearity. Simulation was also used to generate artificial data sets, which broadened the scope of the application. The study finds that the James-Stein estimator attains the minimum MSE, and therefore recommends its use for estimating the parameters of models affected by multicollinearity.
A NEW OPTIMAL SELECTION OF A RIDGE REGRESSION PARAMETER
In this paper, a new ridge parameter is proposed for the ridge regression estimator. The proposal combines the best-known ridge parameter estimators with the sample size. The superiority of the new biasing parameter estimator is investigated through the mean squared error (MSE) criterion and the relative efficiency criterion. Simulation experiments show that the new ridge estimator has less bias and the smallest MSE across all degrees of multicollinearity, all levels of error variance, and all sample sizes. It is also found that the length of the new proposed estimator converges to the length of the true regression parameter vector. The results of the new ridge estimator are compared with the ordinary least squares (OLS) estimator, all the ridge estimators that the new estimator combines, and some other recent ridge estimators. It can be concluded that the new estimator has little bias and is more consistent than the other ridge estimators considered in this paper and the OLS estimator.
Performance of a new ridge regression estimator
Journal of the Association of Arab Universities for Basic and Applied Sciences, 2010
The ridge regression estimator was introduced as an alternative to the ordinary least squares (OLS) estimator in the presence of multicollinearity. Several studies of ridge regression have dealt with the choice of the ridge parameter, and many algorithms for it have been proposed in the statistical literature. In this article, a new method for estimating the ridge parameter is proposed. A simulation study evaluates the performance of the proposed estimator using the mean squared error (MSE) criterion, comparing its MSE with those of other well-known estimators. In the presence of multicollinearity, the simulation study indicates that under certain conditions the proposed estimator performs better than the others.
The Performance Of Some Biased Estimators With Different Biased Parameter In Linear Regression Model
Al-Qadisiyah Journal of Pure Science
To circumvent the problem of multicollinearity, biased estimation methods have been suggested to improve the precision of estimators. In this paper, we study types of biased estimators that can help reduce the effect of multicollinearity on estimation. A simulation study examines the relative effectiveness of certain biased estimators compared with some estimated ridge parameters (k) proposed in the literature. Moreover, a real data set is considered to support the simulation results, based on the estimated mean squared error criterion.
A Comparative Study on the Performance of New Ridge Estimators
Pakistan Journal of Statistics and Operation Research, 2016
Least squares estimators in multiple linear regression become unstable under multicollinearity, producing large variances for the estimated regression coefficients. Hoerl and Kennard (1970) developed ridge estimators for cases with a high degree of collinearity. In ridge estimation, the estimation of the ridge parameter (k) is vital. In this article, new methods for estimating the ridge parameter are introduced, and the performance of the proposed estimators is investigated through their mean squared errors (MSE). A Monte Carlo simulation indicates that the proposed estimators perform better than the ordinary least squares (OLS) estimator as well as several other ridge estimators.
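As an illustration of the kind of ridge parameter estimator these papers compare, the well-known Hoerl-Kennard-Baldwin choice k = p·s²/(b'b) can be computed directly from the OLS fit; the sketch below assumes a full-rank design matrix, and the data are hypothetical:

```python
import numpy as np

def hkb_ridge_parameter(X, y):
    """Hoerl-Kennard-Baldwin ridge parameter k = p * sigma2 / (b'b),
    where b is the OLS estimate and sigma2 the residual variance."""
    n, p = X.shape
    b = np.linalg.solve(X.T @ X, X.T @ y)   # OLS estimate
    resid = y - X @ b
    sigma2 = resid @ resid / (n - p)        # unbiased error-variance estimate
    return p * sigma2 / (b @ b)

# Hypothetical collinear data to exercise the estimator.
rng = np.random.default_rng(1)
x1 = rng.normal(size=80)
X = np.column_stack([x1, x1 + rng.normal(scale=0.1, size=80)])
y = X @ np.array([1.0, 1.0]) + rng.normal(size=80)
k_hat = hkb_ridge_parameter(X, y)           # a positive scalar
```

Most of the proposals surveyed in these abstracts are variations on this theme: a positive k built from the residual variance and the OLS coefficients, tuned so the resulting ridge estimator beats OLS in MSE.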
Some Ridge Regression Estimators and Their Performances
Journal of Modern Applied Statistical Methods, 2016
The estimation of the ridge parameter is an important problem in ridge regression, which is widely used to solve the multicollinearity problem. A comprehensive study of 28 available estimators and five proposed ridge estimators, KB1, KB2, KB3, KB4, and KB5, is provided. A simulation study was conducted in which selected estimators were compared. Some of the selected ridge estimators performed well compared with the ordinary least squares (OLS) estimator and some existing popular ridge estimators; one of the proposed estimators, KB3, performed best. Numerical examples are given.
A Ridge Regression Estimation Approach When Multicollinearity is Present
2016
In regression problems, we usually try to estimate the parameters β in the general linear regression model Y = Xβ + u, and we need a method to estimate the parameter vector β. The most common method is the ordinary least squares (OLS) estimator. However, in the presence of multicollinearity, the efficiency of OLS can be radically reduced because of the large variances of the estimated regression coefficients. An alternative to the OLS estimator was recommended by Hoerl and Kennard [3], namely the ridge regression estimator. In this paper, a suggested method of finding the ridge parameter k is investigated and evaluated in terms of mean squared error (MSE) by simulation techniques. The results of a simulation study indicate that, with respect to the MSE criterion, the suggested estimators perform better than both the OLS estimator and the other estimators discussed here.
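The simulation comparisons these abstracts describe can be sketched as a small Monte Carlo experiment; the design below (collinearity level, a fixed k = 1, sample size, number of replications) is an arbitrary illustration, not the setup of any particular paper:

```python
import numpy as np

rng = np.random.default_rng(42)
beta_true = np.array([1.0, 1.0])
n, reps, k = 50, 200, 1.0
mse_ols, mse_ridge = 0.0, 0.0
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = x1 + rng.normal(scale=0.05, size=n)          # induce multicollinearity
    X = np.column_stack([x1, x2])
    y = X @ beta_true + rng.normal(size=n)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)          # OLS estimate
    b_rdg = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)  # ridge estimate
    mse_ols += np.sum((b_ols - beta_true) ** 2) / reps
    mse_ridge += np.sum((b_rdg - beta_true) ** 2) / reps
```

Under strong collinearity, OLS has enormous variance along the ill-determined coefficient direction, so the small bias ridge introduces is more than repaid in MSE, which is the trade-off all of these papers exploit.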
On Model Selection Criteria of Multivariate Ridge Regression and Ordinary Least Square Regression
Ridge regression is a technique that allows biased estimation of regression parameters that are quite close to the true values in the presence of correlated predictor variables. The paper therefore introduces the ridge regression estimator as an alternative to the ordinary least squares (OLS) estimator in the presence of multicollinearity and interprets model selection criteria in that setting. The method is applied to economic data, and Monte Carlo simulations from a multivariate normal distribution are used to compare the characteristics and performance of the models. Based on the mean squared error and the variance of each model, the analysis shows that the ridge regression models are more effective than ordinary least squares when the multicollinearity problem exists. We therefore conclude that ridge regression gives the best model, because it has a smaller MSE and a smaller variance.