Generalized Least Squares (GLS vs OLS) (2022)

The usual Ordinary Least Squares (OLS) method assigns equal weight (or importance) to each observation. Generalized Least Squares (GLS), by contrast, takes information about unequal error variances into account explicitly and is therefore capable of producing BLUE estimators. Both GLS and OLS are regression techniques used to fit a line to data points and estimate the relationship between a dependent variable ($Y$) and one or more independent variables ($X$).

Consider the following two-variable model:

\begin{align}
Y_i &= \beta_1 + \beta_2 X_i + u_i \nonumber \\
\text{or} \nonumber \\
Y_i &= \beta_1 X_{0i} + \beta_2 X_i + u_i, \tag*{(eq1)}
\end{align}

where $X_{0i}=1$ for each $i$.

Generalized Least Squares (GLS)

Assume that the heteroscedastic variance $\sigma_i^2$ is known. Divide both sides of (eq1) by $\sigma_i$ to obtain the transformed model:

\begin{align}
\frac{Y_i}{\sigma_i} &= \beta_1 \left(\frac{X_{0i}}{\sigma_i}\right) + \beta_2 \left(\frac{X_i}{\sigma_i}\right) + \left(\frac{u_i}{\sigma_i}\right) \nonumber \\
Y_i^* &= \beta_1^* X_{0i}^* + \beta_2^* X_i^* + u_i^*, \tag*{(eq2)}
\end{align}

where the starred variables are the original variables divided by the known $\sigma_i$. The starred coefficients are the transformed model's parameters, distinguishing them from the OLS parameters $\beta_1$ and $\beta_2$. The variance of the transformed error term is

\begin{align*}
Var(u_i^*) &= E(u_i^{*2}) = E\left(\frac{u_i}{\sigma_i}\right)^2 \\
&= \frac{1}{\sigma_i^2}E(u_i^2) \tag*{$\because E(u_i)=0$} \\
&= \frac{1}{\sigma_i^2}\sigma_i^2 = 1, \text{ which is a constant.} \tag*{$\because E(u_i^2)=\sigma_i^2$}
\end{align*}

The variance of the transformed $u_i^*$ is now homoscedastic, so applying OLS to the transformed model (eq2) produces estimators that are BLUE; that is, $\hat{\beta}_1^*$ and $\hat{\beta}_2^*$ are now BLUE, while the OLS estimators $\hat{\beta}_1$ and $\hat{\beta}_2$ are not.
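
As a quick illustration, the following R sketch verifies this equivalence: OLS on the starred variables reproduces the weighted fit. The data and the known $\sigma_i$ are simulated purely for illustration and are not part of the original post.

```r
# Minimal sketch: OLS on the sigma-transformed (starred) variables equals WLS.
# The data and sigma_i below are simulated assumptions, for illustration only.
set.seed(123)
n       <- 100
x       <- runif(n, 1, 10)
sigma_i <- 0.5 * x                         # assumed known heteroscedastic sd
y       <- 2 + 3 * x + rnorm(n, sd = sigma_i)

# OLS on the transformed model (eq2): divide Y, the intercept column X0 = 1,
# and X by sigma_i, and suppress the ordinary intercept.
fit_star <- lm(I(y / sigma_i) ~ 0 + I(1 / sigma_i) + I(x / sigma_i))

# The equivalent weighted fit with w_i = 1 / sigma_i^2
fit_wls <- lm(y ~ x, weights = 1 / sigma_i^2)

coef(fit_star)  # beta_1*, beta_2*
coef(fit_wls)   # identical estimates
```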

Generalized Least Squares (GLS) Method

The procedure of transforming the original variables in such a way that the transformed variables satisfy the assumptions of the classical model, and then applying OLS to them, is known as the Generalized Least Squares (GLS) method.

In other words, GLS is OLS applied to transformed variables that satisfy the standard least-squares assumptions. The estimators obtained in this way are known as GLS estimators and are BLUE.

To obtain a Generalized Least Squares estimator, we minimize

\begin{align}
\sum \hat{u}_i^{*2} &= \sum \left(Y_i^* - \hat{\beta}_1^* X_{0i}^* - \hat{\beta}_2^* X_i^*\right)^2 \nonumber \\
\text{that is,} \nonumber \\
\sum \left(\frac{\hat{u}_i}{\sigma_i}\right)^2 &= \sum \left[\frac{Y_i}{\sigma_i} - \hat{\beta}_1^*\left(\frac{X_{0i}}{\sigma_i}\right) - \hat{\beta}_2^*\left(\frac{X_i}{\sigma_i}\right)\right]^2 \tag*{(eq3)} \\
\sum w_i \hat{u}_i^2 &= \sum w_i\left(Y_i - \hat{\beta}_1^* X_{0i} - \hat{\beta}_2^* X_i\right)^2 \tag*{(eq4)}
\end{align}

The GLS estimator $\hat{\beta}_2^*$ of the slope coefficient and its variance are

\begin{align*}
\hat{\beta}_2^* &= \frac{\left(\sum w_i\right)\left(\sum w_i X_i Y_i\right)-\left(\sum w_i X_i\right)\left(\sum w_i Y_i\right)}{\left(\sum w_i\right)\left(\sum w_i X_i^2\right)-\left(\sum w_i X_i\right)^2} \\
Var(\hat{\beta}_2^*) &= \frac{\sum w_i}{\left(\sum w_i\right)\left(\sum w_i X_i^2\right)-\left(\sum w_i X_i\right)^2},
\end{align*}

where $w_i=\frac{1}{\sigma_i^2}$.
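
A short R check of this formula, reusing the simulated `x`, `y`, and `sigma_i` from the sketch above (illustrative data only): the hand-computed $\hat{\beta}_2^*$ matches the slope from a weighted `lm()` fit.

```r
# Compute the GLS slope from the weighted-sums formula and compare it with
# lm() using weights w_i = 1 / sigma_i^2 (x, y, sigma_i as simulated above).
w <- 1 / sigma_i^2
b2_gls <- (sum(w) * sum(w * x * y) - sum(w * x) * sum(w * y)) /
          (sum(w) * sum(w * x^2) - sum(w * x)^2)
b2_gls
coef(lm(y ~ x, weights = w))[["x"]]  # matches b2_gls
```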

Difference between GLS and OLS

In GLS, a weighted sum of squared residuals is minimized with $w_i=\frac{1}{\sigma_i^2}$ acting as the weights, whereas in OLS an unweighted (or equally weighted) residual sum of squares is minimized. From equation (eq3), in GLS the weight assigned to each observation is inversely proportional to its $\sigma_i$; that is, observations coming from a population with a larger $\sigma_i$ get relatively smaller weight, and those from a population with a smaller $\sigma_i$ get proportionately larger weight in minimizing the RSS (eq4).

Since equation (eq4) minimizes a weighted RSS, the procedure is known as weighted least squares (WLS), and the estimators obtained are known as WLS estimators.

GLS Method

The Generalized Least Squares method is a powerful tool for handling correlated and heteroscedastic errors. It is widely used in econometrics, finance, and other fields where regression analysis is applied to real-world data with complex error structures.

The key differences between the GLS and OLS methods are:

  • Assumptions: GLS can handle heteroscedasticity and autocorrelation; OLS requires homoscedasticity and independent error terms.
  • Method: GLS minimizes a weighted sum of squared residuals; OLS minimizes the unweighted sum of squared residuals.
  • Benefits: GLS gives more efficient estimates (if its assumptions are met); OLS is simpler to implement.
  • Drawbacks: GLS is more complex and requires estimating the error covariance matrix; OLS can be inefficient when its assumptions are violated.

Remember, diagnosing violations of assumptions such as heteroscedasticity and autocorrelation is often performed after an initial OLS fit. This can help decide whether GLS or other robust regression techniques are necessary. Therefore, the choice between OLS and GLS depends on the data characteristics and the sample size.
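
A minimal R sketch of such a diagnostic, again using the simulated data from above: fit OLS first and inspect the residuals for a fan shape before committing to GLS.

```r
# Fit OLS, then plot residuals against fitted values; a fanning (widening)
# pattern suggests heteroscedasticity and motivates GLS/WLS.
fit_ols <- lm(y ~ x)
plot(fitted(fit_ols), resid(fit_ols),
     xlab = "Fitted values", ylab = "Residuals",
     main = "Residuals vs fitted values")
abline(h = 0, lty = 2)
```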

Read about the Residual Plot


White Test of Heteroscedasticity Detection (2022)

This post is about the White test of heteroscedasticity.

One important assumption of Regression is that the variance of the Error Term is constant across observations. If the error has a constant variance, then the errors are called homoscedastic, otherwise heteroscedastic. In the case of heteroscedastic errors (non-constant variance), the standard estimation methods become inefficient. Typically, to assess the assumption of homoscedasticity, residuals are plotted.

White test of Heteroscedasticity

Halbert White (1980) proposed a test that is very similar to the Breusch-Pagan test. The White test of heteroscedasticity is general because it does not rely on the normality assumption, and it is also easy to implement. Because of its generality, the White test may identify specification bias too. Both the White test of heteroscedasticity and the Breusch-Pagan test are based on the residuals of the fitted model.

To test the assumption of homoscedasticity, one can use auxiliary regression analysis by regressing the squared residuals from the original model on the set of original regressors, the cross-products of the regressors, and the squared regressors.

The step-by-step procedure for performing the White test of Heteroscedasticity is as follows:

Consider the following Linear Regression Model (assume there are two independent variables)
\[Y_i=\beta_0+\beta_1X_{1i}+\beta_2X_{2i}+e_i \tag{1} \]

For the given data, estimate the regression model, and obtain the residuals $e_i$’s.

Note that the regression of residuals can take linear or non-linear functional forms.

  1. Now regress the squared residuals $e_i^2$ from the original regression on the original set of independent variables, the squared values of the independent variables, and the cross-product(s) of the independent variable(s), such as
    \[e_i^2=\alpha_0+\alpha_1X_1+\alpha_2X_2+\alpha_3X_1^2+\alpha_4X_2^2+\alpha_5X_1X_2+v_i \tag{2}\]
  2. Find the $R^2$ statistic from the auxiliary regression (2).
    You can also use higher-power regressors, such as cubes. Also, note that there will be a constant term in equation (2) even though the original regression model (1) may or may not have a constant term.
  3. Test the statistical significance of \[n \times R^2\sim\chi^2_{df} \tag{3}\] under the null hypothesis of homoscedasticity (no heteroscedasticity), where $df$ is the number of regressors in equation (2).
  4. If the calculated chi-square value obtained in (3) is greater than the critical chi-square value at the chosen level of significance, reject the hypothesis of homoscedasticity in favor of heteroscedasticity.
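
The following R sketch implements this procedure on simulated data; the variable names and data-generating process are illustrative assumptions, not part of the original post.

```r
# White test by hand: auxiliary regression of squared residuals on the
# regressors, their squares, and their cross product; then n * R^2 ~ chi-square.
set.seed(42)
n  <- 200
x1 <- runif(n)
x2 <- runif(n)
y  <- 1 + 2 * x1 + 3 * x2 + rnorm(n, sd = 1 + 2 * x1)  # heteroscedastic errors

fit <- lm(y ~ x1 + x2)        # original regression, equation (1)
e2  <- resid(fit)^2           # squared residuals

# Auxiliary regression, equation (2)
aux <- lm(e2 ~ x1 + x2 + I(x1^2) + I(x2^2) + I(x1 * x2))

stat <- n * summary(aux)$r.squared          # n * R^2, equation (3)
df   <- length(coef(aux)) - 1               # number of auxiliary regressors
pval <- pchisq(stat, df = df, lower.tail = FALSE)
c(statistic = stat, df = df, p.value = pval)

# Equivalently, lmtest::bptest(fit, ~ x1 * x2 + I(x1^2) + I(x2^2))
# runs the same auxiliary regression (requires the lmtest package).
```
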
[Figure: Heteroscedasticity patterns]

For a model with several independent variables (regressors), introducing all the regressors, their squared or higher-power terms, and their cross products consumes many degrees of freedom.

In cases where the White test statistic is statistically significant, the cause may not necessarily be heteroscedasticity but rather specification error. In other words, the White test can be a test of heteroscedasticity, of specification error, or of both. If no cross-product terms are introduced in the White test procedure, then it is a test of pure heteroscedasticity; if cross-product terms are introduced, then it is a test of both heteroscedasticity and specification bias.


By employing the White test of heteroscedasticity, one can gain valuable insight into the presence of heteroscedasticity and decide on appropriate corrective measures (like Weighted Least Squares (WLS)), if necessary, to ensure reliable standard errors and hypothesis tests in a regression analysis.

Summary

The White test of heteroscedasticity is a flexible approach that can detect various patterns of heteroscedasticity. It indicates the presence of heteroscedasticity, but it does not pinpoint the specific cause (which may even be model misspecification). The White test is relatively easy to implement in statistical software.

References

  • White, H. (1980). "A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity". Econometrica, 48(4), 817–838.
  • https://en.wikipedia.org/wiki/White_test

Click Links to learn more about Tests of Heteroscedasticity: Regression Residuals Plot, Breusch-Pagan Test, Goldfeld-Quandt Test

See the Numerical Example of the White Test of Heteroscedasticity


Breusch Pagan Test for Heteroscedasticity (2021)

The Breusch Pagan test (named after Trevor Breusch and Adrian Pagan) is used to check for the presence of heteroscedasticity in a linear regression model.

Assume our regression model is $Y_i = \beta_1 + \beta_2 X_{2i} + u_i$, i.e., we have a simple linear regression model, with $E(u_i^2)=\sigma_i^2$, where $\sigma_i^2=f(\alpha_1 + \alpha_2 Z_{2i})$.

That is, $\sigma_i^2$ is some function of the non-stochastic variable(s) $Z$. The function $f(\cdot)$ allows for both linear and non-linear forms of the model. The variable $Z$ may be the independent variable $X$ itself, or it may represent a group of independent variables other than $X$.

Steps to Perform the Breusch Pagan Test

  1. Estimate the model by OLS and obtain the residuals $\hat{u}_1, \hat{u}_2, \cdots, \hat{u}_n$.
  2. Estimate the variance of the residuals, i.e., $\hat{\sigma}^2=\frac{\sum \hat{u}_i^2}{n}$.
  3. Run the regression $\frac{\hat{u}_i^2}{\hat{\sigma}^2}=\alpha_1+\alpha_2 Z_i + v_i$ and compute the explained sum of squares (ESS) from this regression.
  4. Test the statistical significance of $\frac{ESS}{2}$ by a $\chi^2$-test with 1 df at the appropriate level of significance ($\alpha$).
  5. Reject the hypothesis of homoscedasticity in favour of heteroscedasticity if $\frac{ESS}{2} > \chi^2_{(1)}$ at the appropriate level of $\alpha$.
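
A minimal R sketch of these steps on simulated data; taking the single $Z$ to be $X$ itself is an illustrative choice, not a requirement of the test.

```r
# Breusch-Pagan test by hand, following the steps above.
set.seed(7)
n <- 150
x <- runif(n, 1, 5)
y <- 1 + 2 * x + rnorm(n, sd = x)   # error variance grows with x

fit  <- lm(y ~ x)                   # step 1: OLS fit and residuals
u2   <- resid(fit)^2
sig2 <- sum(u2) / n                 # step 2: ML estimate of sigma^2
p    <- u2 / sig2
aux  <- lm(p ~ x)                   # step 3: regress u^2/sigma^2 on Z = x

ESS   <- sum((fitted(aux) - mean(p))^2)  # explained sum of squares
theta <- ESS / 2                         # steps 4-5: ESS/2 ~ chi-square(1)
pchisq(theta, df = 1, lower.tail = FALSE)

# The packaged equivalent is lmtest::bptest(fit, studentize = FALSE).
```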

Note that:

  • The Breusch Pagan test is valid only if $u_i$’s are normally distributed.
  • For $k$ independent variables, $\frac{ESS}{2}$ has a ($\chi^2$) Chi-square distribution with $k$ degrees of freedom.
  • If the $u_i$’s (error term) are not normally distributed, the White test is used.

If heteroscedasticity is detected, remedies may include using robust standard errors, transforming the data, or employing weighted least squares estimation to adjust for heteroscedasticity.
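
A hedged sketch of two of these remedies in R, assuming the sandwich and lmtest packages are installed and reusing the `fit` from the sketch above; the weights $1/x^2$ are illustrative and appropriate only if $Var(u_i)$ is proportional to $x_i^2$.

```r
library(sandwich)   # heteroscedasticity-consistent covariance estimators
library(lmtest)     # coeftest() for inference with a supplied vcov

# Remedy 1: keep the OLS estimates but use robust (HC) standard errors
coeftest(fit, vcov = vcovHC(fit, type = "HC1"))

# Remedy 2: weighted least squares with illustrative weights 1 / x^2
lm(y ~ x, weights = 1 / x^2)
```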

The Breusch Pagan test is considered a useful tool for detecting the presence of heteroscedasticity in regression models, and it helps to ensure the validity of statistical inference and estimation.

A sample of Stata output related to the Breusch-Pagan test for the detection of heteroscedasticity is shown below.

[Figure: Breusch-Pagan test Stata output]

By analyzing the p-value of the chi-squared test statistic from the second regression, one can decide whether to reject the null hypothesis of homoscedasticity. If the p-value is lower than the chosen level of significance (say, 0.05), one has evidence of heteroscedasticity.

The following important points need to be considered while using the Breusch Pagan test of heteroscedasticity:

  • The Breusch-Pagan test can be sensitive to the normality of the error terms. Therefore, it is advisable to check whether the residuals are normally distributed before running the Breusch-Pagan test.
  • There are other tests for heteroscedasticity, but the Breusch-Pagan test is a widely used and relatively straightforward option.

References:

  • Breusch, T. S. & Pagan, A. R. (1979). "A Simple Test for Heteroscedasticity and Random Coefficient Variation". Econometrica, 47(5), 1287–1294.

See the Numerical Example of the Breusch-Pagan Test for the Detection of Heteroscedasticity


Key Points of Heteroscedasticity (2021)

The following are some key points about heteroscedasticity. These key points cover the definition, examples, properties, assumptions, and tests for the detection of heteroscedasticity ("hetero" for short).

One important assumption of Regression is that the variance of the Error Term is constant across observations. If the error has a constant variance, then the errors are called homoscedastic, otherwise heteroscedastic. In the case of heteroscedastic errors (non-constant variance), the standard estimation methods become inefficient. Typically, to assess the assumption of homoscedasticity, residuals are plotted.

Heteroscedasticity

  • The disturbance term of OLS regression $u_i$ should be homoscedastic. By Homo, we mean equal, and scedastic means spread or scatter.
  • By hetero, we mean unequal.
  • Heteroscedasticity means that the conditional variance of $Y_i$ (i.e., $var(u_i))$ conditional upon the given $X_i$ does not remain the same regardless of the values taken by the variable $X$.
  • In the case of heteroscedasticity, $E(u_i^2)=\sigma_i^2=var(u_i)$, where $i=1,2,\cdots, n$.
  • In the case of homoscedasticity, $E(u_i^2)=\sigma^2=var(u_i)$, where $i=1,2,\cdots, n$.
  • Homoscedasticity means that the conditional variance of $Y_i$ (i.e. $var(u_i))$ conditional upon the given $X_i$ remains the same regardless of the values taken by the variable $X$.
  • The error terms are heteroscedastic when the scatter of the errors differs, varying with the value of one or more of the explanatory variables.
  • Heteroscedasticity is a systematic change in the spread of the residuals over the range of measured values.
  • The presence of heteroscedasticity may be due to (i) the presence of outliers in the data, (ii) an incorrect functional form of the regression model, (iii) an incorrect transformation of the data, or (iv) mixing observations with different measures of scale.
  • The presence of hetero does not destroy the unbiasedness and consistency of OLS estimators.
  • Hetero is more common in cross-section data than time-series data.
  • Hetero may affect the variance and standard errors of the OLS estimates.
  • The standard errors of OLS estimates are biased in the case of hetero.
  • Statistical inferences (confidence intervals and hypothesis testing) of estimated regression coefficients are no longer valid.
  • The OLS estimators are no longer BLUE as they are no longer efficient in the presence of hetero.
  • The regression predictions are inefficient in the case of hetero.
  • The usual OLS method assigns equal weights to each observation.
  • In GLS the weight assigned to each observation is inversely proportional to $\sigma_i$.
  • In GLS a weighted sum of squares is minimized with weight $w_i=\frac{1}{\sigma_i^2}$.
  • In GLS each squared residual is weighted by the inverse of $Var(u_i|X_i)$
  • GLS estimates are BLUE.
  • Heteroscedasticity can be detected by plotting the estimated $\hat{u}_i^2$ against $\hat{Y}_i$.
  • When plotting $\hat{u}_i^2$ against $\hat{Y}_i$, if no systematic pattern exists, then there is no hetero.
  • In the case of prior information about $\sigma_i^2$, one may use WLS.
  • If $\sigma_i^2$ is unknown, one may proceed with heteroscedastic corrected standard errors (that are also called robust standard errors).
  • Drawing inferences in the presence of hetero (or if hetero is suspected) may be very misleading.
