Leverage and Covariance Matrix Estimation in Finite-Sample IV Regressions

Improved heteroscedasticity‐consistent covariance matrix estimators

2000

The heteroskedasticity-consistent covariance matrix estimator proposed by White (1980), also known as HC0, is commonly used in practical applications and is implemented in a number of statistical software packages. Cribari-Neto, Ferrari and Cordeiro have developed a bias-adjustment scheme that delivers bias-corrected White estimators. There are several variants of the original White estimator that are also commonly used by practitioners.
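The HC0 estimator discussed above is straightforward to compute directly as a "sandwich" around the usual OLS bread. The following is a minimal numpy sketch; the data-generating code and variable names are illustrative, not taken from any of the papers summarized here:

```python
import numpy as np

def hc0_cov(X, resid):
    """White's HC0 sandwich estimator:
    (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1}."""
    XtX_inv = np.linalg.inv(X.T @ X)
    meat = X.T @ (resid[:, None] ** 2 * X)  # X' diag(e^2) X
    return XtX_inv @ meat @ XtX_inv

# Toy heteroskedastic data (illustrative only)
rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n) * (1 + np.abs(X[:, 1]))

beta = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta
se_hc0 = np.sqrt(np.diag(hc0_cov(X, e)))  # HC0 standard errors
```

The same sandwich form underlies every HC variant; they differ only in how the squared residuals in the "meat" are weighted.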

HETEROSCEDASTICITY OF UNKNOWN FORM: A COMPARISON OF FIVE HETEROSCEDASTICITY-CONSISTENT COVARIANCE MATRIX (HCCM) ESTIMATORS

Regression model applications frequently involve violations of the homoscedasticity assumption and the presence of high leverage points (HLPs). This study investigated the impact of the Heteroscedasticity-Consistent Covariance Matrix (HCCM) estimator in the presence of heteroscedasticity of unknown form. The effectiveness of five variants of the HCCM, namely White's estimator (HC0), White-Hinkley (HC1), MacKinnon-White (HC2), Davidson-MacKinnon (HC3), and Cribari-Neto (HC4), was assessed to identify the optimal HCCM estimator. In the study a simulated dataset was analysed using EViews Version 12. The Breusch-Pagan-Godfrey test for heteroscedasticity was applied and a p-value of 0.0123 was obtained, indicating the presence of heteroscedasticity in the model. Applying the HCCM estimators and comparing the heteroscedasticity-consistent standard error estimates showed that HC0 had 124.104, HC1 had 1189.222, HC2 had 1175.282, HC3 had 1106.94, and HC4 had 1140.707. These results reveal that HC3 and HC4 produced smaller errors compared to HC0, HC1 and HC2. The study hence concludes that when doing inferential tests using OLS regression, the use of heteroscedasticity-consistent standard errors increases the researcher's confidence in the accuracy and power of those tests. This study therefore suggests that, to ensure that findings are not affected by heteroscedasticity, researchers should use HCCM estimators, specifically HC3 and HC4, as these presented better results in comparison to the others.
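The five variants compared in this abstract differ only in how the squared OLS residuals are discounted by the leverages h_i (the diagonal of the hat matrix). A numpy sketch of all five follows; the weighting rules match the standard textbook definitions (for HC4, the exponent min(4, n·h_i/k)), but treat the function as an illustration rather than the study's own code:

```python
import numpy as np

def hccm_se(X, resid, variant="HC3"):
    """Standard errors under five HCCM variants.
    Weights on the squared residuals e_i^2:
      HC0: 1            HC1: n/(n-k)
      HC2: 1/(1-h_i)    HC3: 1/(1-h_i)^2
      HC4: 1/(1-h_i)^{delta_i}, delta_i = min(4, n*h_i/k)."""
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)  # leverages
    e2 = resid ** 2
    if variant == "HC0":
        w = e2
    elif variant == "HC1":
        w = e2 * n / (n - k)
    elif variant == "HC2":
        w = e2 / (1 - h)
    elif variant == "HC3":
        w = e2 / (1 - h) ** 2
    elif variant == "HC4":
        delta = np.minimum(4.0, n * h / k)
        w = e2 / (1 - h) ** delta
    else:
        raise ValueError(f"unknown variant: {variant}")
    V = XtX_inv @ (X.T * w) @ X @ XtX_inv  # sandwich
    return np.sqrt(np.diag(V))
```

Because the HC2 and HC3 weights inflate each squared residual by at least a factor of one, the resulting standard errors are never smaller than HC0's for the same data; HC3 and HC4 penalize high-leverage observations most heavily.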

Heteroskedasticity-Robust Inference in Linear Regressions

Communications in Statistics-simulation and Computation, 2010

The assumption that all errors share the same variance (homoskedasticity) is commonly violated in empirical analyses carried out using the linear regression model. A widely adopted modeling strategy is to perform point estimation by ordinary least squares and then perform testing inference based on these point estimators and heteroskedasticity-consistent standard errors. These tests, however, tend to be size-distorted when the sample size is small and the data contain atypical observations. Furno (1996) suggested performing point estimation using a weighted least squares mechanism in order to attenuate the effect of leverage points on the associated inference. In this article, we follow up on her proposal and define heteroskedasticity-consistent covariance matrix estimators based on residuals obtained using robust estimation methods. We report Monte Carlo simulation results (size and power) on the finite sample performance of different heteroskedasticity-robust tests. Overall, the results favor inference based on HC0 tests constructed using robust residuals.

Using Heteroscedasticity Consistent Standard Errors in the Linear Regression Model

The American Statistician, 2000

In the presence of heteroscedasticity, OLS estimates are unbiased, but the usual tests of significance are generally inappropriate and their use can lead to incorrect inferences. Tests based on a heteroscedasticity consistent covariance matrix (HCCM), however, are consistent even in the presence of heteroscedasticity of an unknown form. Most applications that use an HCCM appear to rely on the asymptotic version known as HC0. Our Monte Carlo simulations show that HC0 often results in incorrect inferences when N ≤ 250, while three relatively unknown small-sample versions of the HCCM, and especially a version known as HC3, work well even for N's as small as 25. We recommend that: 1) data analysts should correct for heteroscedasticity using an HCCM whenever there is reason to suspect heteroscedasticity; 2) the decision to use HCCM-based tests should not be determined by a screening test for heteroscedasticity; and 3) when N ≤ 250, the HCCM known as HC3 should be used. Since HC3 is simple to compute, we encourage authors of statistical software to add this estimator to their programs.
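The quasi-t inference recommended here (HC3 standard errors in place of the usual OLS ones) can be sketched from scratch in a few lines of numpy. The helper below is illustrative, not the authors' code:

```python
import numpy as np

def hc3_tstats(X, y):
    """OLS coefficients and quasi-t statistics using HC3
    standard errors, the small-sample choice recommended
    in the abstract above."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    e = y - X @ beta
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)  # leverages
    w = e ** 2 / (1 - h) ** 2                    # HC3 weights
    V = XtX_inv @ (X.T * w) @ X @ XtX_inv        # sandwich
    return beta, beta / np.sqrt(np.diag(V))
```

The quasi-t statistics are then compared with t (or standard normal) critical values exactly as in ordinary OLS testing.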

Inference under heteroskedasticity and leveraged data

2007

We evaluate the finite-sample behavior of different heteroskedasticity-consistent covariance matrix estimators, under both constant and unequal error variances. We consider the estimator proposed by Halbert White (HC0), and also its variants known as HC2, HC3, and HC4; the latter was recently proposed by Cribari-Neto (2004). We propose a new covariance matrix estimator: HC5. It is the first consistent estimator to explicitly take into account the effect that the maximal leverage has on the associated inference. Our numerical results show that quasi-t inference based on HC5 is typically more reliable than inference based on other covariance matrix estimators.
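For orientation only, here is a numpy sketch of an HC5-type estimator, whose discount exponent depends on the maximal leverage as the abstract describes. The exact exponent formula below, including the constant 0.7, is our reading of the HC5 proposal and should be verified against the original paper before any serious use:

```python
import numpy as np

def hc5_cov(X, resid, kconst=0.7):
    """Sketch of an HC5-type sandwich estimator (formula assumed,
    not verified against the source paper):
      alpha_i = min{ n*h_i/sum(h), max(4, n*kconst*h_max/sum(h)) }
      w_i     = e_i^2 / sqrt((1 - h_i)^alpha_i)
    Note sum(h) equals the number of regressors, so n*h_i/sum(h)
    is the leverage relative to its average."""
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)  # leverages
    alpha = np.minimum(n * h / h.sum(),
                       np.maximum(4.0, n * kconst * h.max() / h.sum()))
    w = resid ** 2 / np.sqrt((1 - h) ** alpha)
    return XtX_inv @ (X.T * w) @ X @ XtX_inv
```

The distinguishing feature relative to HC4 is that the exponent is capped using the maximal leverage in the sample rather than a fixed constant.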

A Minimax Bias Estimator for Ols Variances Under Heteroskedasticity

SSRN Electronic Journal, 2000

Analytic evaluation of heteroskedasticity consistent covariance matrix estimates (HCCME) is difficult because of the complexity of the formulae currently available. We obtain new analytic formulae for the bias of a class of estimators of the covariance matrix of OLS in a standard linear regression model. These formulae provide substantial insight into the properties and performance characteristics of these estimators. In particular, we find a new estimator which minimizes the maximum possible bias and improves substantially on the standard Eicker-White estimate.

A sequence of improved standard errors under heteroskedasticity of unknown form

Fuel and Energy Abstracts, 2011

The linear regression model is commonly used by practitioners to model the relationship between the variable of interest and a set of explanatory variables. The assumption that all error variances are the same (homoskedasticity) is oftentimes violated. Consistent regression standard errors can be computed using the heteroskedasticity-consistent covariance matrix estimator proposed by White (1980). Such standard errors, however, typically display nonnegligible systematic errors in finite samples, especially under leveraged data. Cribari-Neto et al. (2000) improved upon the White estimator by defining a sequence of bias-adjusted estimators with increasing accuracy. In this paper, we improve upon their main result by defining an alternative sequence of adjusted estimators whose biases vanish at a much faster rate. Hypothesis testing inference is also addressed. An empirical illustration is presented.

Simulation‐based tests for heteroskedasticity in linear regression models: Some further results

The Econometrics …, 2006

As shown by Dufour, Khalaf, Bernard and Genest, exact tests for heteroskedasticity can be obtained by using Monte Carlo (MC) techniques if it is assumed that the true form of the error distribution under the null hypothesis is known. The corresponding nonparametric bootstrap tests are only asymptotically valid, but do not require specification of the error law. Since information about the precise form of the error distribution is not often available to applied workers, two questions merit attention. First, how robust are MC tests to incorrect assumptions about the error distribution? Second, are nonparametric bootstrap tests markedly inferior to MC tests when the latter use the correct error distribution? Results relevant to these two questions are derived using asymptotic analysis and simulation experiments. The evidence suggests that the combination of an asymptotically pivotal test statistic with a nonparametric bootstrap gives a robust and well-behaved procedure.
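A nonparametric bootstrap test of the kind discussed can be sketched as follows. The Breusch-Pagan-type statistic (n·R² from regressing squared residuals on the regressors) and the residual-resampling scheme are a simplified illustration, not the authors' exact procedure:

```python
import numpy as np

def bp_stat(X, y):
    """Breusch-Pagan-type statistic: n*R^2 from regressing
    squared OLS residuals on the regressors (X includes an
    intercept column)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e2 = (y - X @ beta) ** 2
    g = np.linalg.lstsq(X, e2, rcond=None)[0]
    ssr = np.sum((e2 - X @ g) ** 2)
    sst = np.sum((e2 - e2.mean()) ** 2)
    return len(y) * (1 - ssr / sst)

def bootstrap_pvalue(X, y, B=499, seed=0):
    """Nonparametric bootstrap test: resample the OLS residuals
    i.i.d. (valid under the homoskedastic null), rebuild y, and
    compare the observed statistic with its bootstrap law."""
    rng = np.random.default_rng(seed)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta
    t_obs = bp_stat(X, y)
    t_boot = np.empty(B)
    for b in range(B):
        e_star = rng.choice(e, size=len(e), replace=True)
        t_boot[b] = bp_stat(X, X @ beta + e_star)
    return (1 + np.sum(t_boot >= t_obs)) / (B + 1)
```

An exact MC test would follow the same outline but draw the errors from an assumed parametric law instead of resampling the residuals, which is precisely the distinction the abstract examines.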