Heteroskedasticity‑robust
Heteroskedasticity‑robust refers to statistical methods that remain valid when the assumption of constant error variance (homoskedasticity) in a regression model is violated. Ordinary least squares (OLS) coefficient estimates remain unbiased under heteroskedasticity, but the usual estimator of their covariance matrix is inconsistent, so the associated t‑statistics, hypothesis tests, and confidence intervals become unreliable. Heteroskedasticity‑robust techniques replace that covariance estimator with one that accounts for non‑constant variance, allowing correct inference while retaining the OLS coefficient estimates.
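The contrast can be seen in a short simulation. The following sketch (assuming NumPy and statsmodels are available; the data‑generating process is purely illustrative) fits the same regression twice, once with the classical covariance and once with a heteroskedasticity‑robust one, to show that the coefficients are identical while the standard errors differ.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 10, size=n)
# Error variance grows with x, violating homoskedasticity.
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 * x, size=n)

X = sm.add_constant(x)
classical = sm.OLS(y, X).fit()             # usual homoskedastic covariance
robust = sm.OLS(y, X).fit(cov_type="HC1")  # heteroskedasticity-robust covariance

print(classical.params, robust.params)     # identical coefficient estimates
print(classical.bse)                       # classical standard errors
print(robust.bse)                          # robust standard errors differ
```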
The most common implementation is the “sandwich” estimator introduced by White. It weights the contribution of each observation to the coefficient covariance matrix by its squared OLS residual, placing a “meat” term built from the residuals between two (X′X)⁻¹ “bread” terms; the resulting estimator is consistent under arbitrary forms of heteroskedasticity. Small‑sample refinements known as HC1, HC2, and HC3 rescale the squared residuals to reduce finite‑sample bias.
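As a concrete illustration of the sandwich form, the sketch below (assuming NumPy and statsmodels; the simulated data are hypothetical) computes the basic White/HC0 covariance directly from the bread and meat matrices and cross‑checks it against the built‑in HC0 output.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, k = 300, 2
X = sm.add_constant(rng.normal(size=(n, k)))   # design matrix with intercept
beta = np.array([1.0, 0.5, -2.0])
y = X @ beta + rng.normal(scale=np.abs(X[:, 1]) + 0.1, size=n)  # heteroskedastic errors

# OLS coefficients and residuals
XtX_inv = np.linalg.inv(X.T @ X)               # the "bread"
b = XtX_inv @ X.T @ y
u = y - X @ b

# HC0 sandwich: bread @ meat @ bread, with meat = sum_i u_i^2 * x_i x_i'
meat = X.T @ (u[:, None] ** 2 * X)
V_hc0 = XtX_inv @ meat @ XtX_inv
se_hc0 = np.sqrt(np.diag(V_hc0))

# Cross-check against the packaged HC0 standard errors
fit = sm.OLS(y, X).fit(cov_type="HC0")
print(se_hc0)
print(fit.bse)   # should match se_hc0
```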
Beyond linear regression, heteroskedasticity‑robust methods extend to generalized linear models, time‑series models, and panel data. In time‑series settings, heteroskedasticity‑ and autocorrelation‑consistent (HAC) estimators such as Newey-West additionally guard against serial correlation, while in panel data cluster‑robust standard errors allow errors to be arbitrarily correlated within groups such as firms or regions.
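A rough sketch of how these extensions look in practice is given below (assuming statsmodels; the lag length, the simulated data, and the grouping variable are illustrative choices, not prescriptions).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
x = rng.normal(size=n)
y = 0.5 + 1.5 * x + rng.normal(scale=1 + np.abs(x), size=n)
X = sm.add_constant(x)

# Newey-West (HAC): robust to heteroskedasticity and to autocorrelation
# up to the chosen number of lags.
hac_fit = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})

# Cluster-robust: allows arbitrary error correlation within each group.
groups = rng.integers(0, 20, size=n)   # 20 hypothetical clusters
cluster_fit = sm.OLS(y, X).fit(cov_type="cluster", cov_kwds={"groups": groups})

print(hac_fit.bse)
print(cluster_fit.bse)
```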
Researchers routinely report robust standard errors when heteroskedasticity is suspected or detected via diagnostic tests such as the Breusch-Pagan or White tests. Because robust standard errors remain valid whether or not heteroskedasticity is actually present, at the cost of some efficiency in small samples, many applied fields report them by default.
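For instance, one might run a Breusch-Pagan test and switch to robust standard errors when the null of homoskedasticity is rejected. The sketch below assumes statsmodels; the simulated data and the 5% threshold are conventional illustrative choices.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(3)
n = 250
x = rng.uniform(1, 5, size=n)
y = 2.0 + 0.8 * x + rng.normal(scale=x, size=n)   # error variance rises with x
X = sm.add_constant(x)

ols_fit = sm.OLS(y, X).fit()
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(ols_fit.resid, X)

# If heteroskedasticity is detected, report robust (HC3) standard errors instead.
if lm_pvalue < 0.05:
    ols_fit = sm.OLS(y, X).fit(cov_type="HC3")

print(f"Breusch-Pagan p-value: {lm_pvalue:.4f}")
print(ols_fit.bse)
```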