One assumption about the random variable *μ* (the error term) is that its probability distribution remains the same for all observations of *X*; in particular, the variance of each *μ<sub>i</sub>* is the same for all values of the explanatory variables. In other words, the variance of the errors is constant across all levels of the independent variables. Symbolically,

$Var(\mu_i) = E\{\mu_i - E(\mu_i)\}^2 = E(\mu_i^2) = \sigma_\mu^2 = \text{constant}$

This assumption is known as the assumption of homoscedasticity, or the assumption of constant variance of the error term *μ*. It means that the variation of each *μ<sub>i</sub>* around its zero mean does not depend on the value of *X* (the independent variable). The assumption matters because the error term expresses influences on the dependent variable such as:

- **Errors in measurement:** Errors of measurement tend to be cumulative over time, and it is difficult to collect data and check its consistency and reliability, so the variance of *μ<sub>i</sub>* may increase with increasing values of *X*.
- **Omitted variables:** Variables omitted from the function (regression model) tend to change in the same direction as *X*, causing an increase in the variance of the observations around the regression line.

Under homoscedasticity, the variance of each *μ<sub>i</sub>* remains the same irrespective of small or large values of the explanatory variable, i.e., $\sigma_\mu^2$ is not a function of *X*: $\sigma_{\mu_i}^2 \ne f(X_i)$.
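
As a quick illustration (a sketch with made-up numbers, not part of the original discussion), the following simulation draws one homoscedastic and one heteroscedastic error term and compares their variances for small and large values of *X*:

```python
# Illustrative sketch: a homoscedastic error has the same variance at
# every X, while a heteroscedastic error's variance changes with X.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
X = rng.uniform(1.0, 10.0, size=n)

mu_homo = rng.normal(0.0, 2.0, size=n)   # Var(mu_i) = 4 for every X_i
mu_hetero = rng.normal(0.0, 0.5 * X)     # standard deviation grows with X

# Compare the error variance for small vs large X
small, large = X < 3.0, X > 8.0
print(mu_homo[small].var(), mu_homo[large].var())      # both close to 4
print(mu_hetero[small].var(), mu_hetero[large].var())  # clearly different
```

Only in the homoscedastic case is the variance the same regardless of where on the *X*-axis it is measured.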

**Consequences if Homoscedasticity is not Met**

If the assumption of homoscedastic disturbances (constant variance) is not fulfilled, we have the following consequences:

- We cannot apply the usual formulas for the variances of the coefficients to conduct tests of significance and construct confidence intervals. The formulas $Var(\hat{\beta}_0)=\sigma_\mu^2 \frac{\sum X_i^2}{n \sum x_i^2}$ and $Var(\hat{\beta}_1) = \sigma_\mu^2 \frac{1}{\sum x_i^2}$ (where $x_i = X_i - \bar{X}$ is in deviation form) are inapplicable.
- If *μ* (the error term) is heteroscedastic, the OLS (Ordinary Least Squares) estimates do not have the minimum variance property in the class of unbiased estimators, i.e., they are inefficient in small samples. Furthermore, they are inefficient in large samples (asymptotically inefficient).
- The coefficient estimates are still statistically unbiased even if the *μ*'s are heteroscedastic. The $\hat{\beta}$'s have no statistical bias, i.e., $E(\hat{\beta}_i)=\beta_i$ (each coefficient's expected value equals the true parameter value).
- Prediction is inefficient, because the variance of a prediction includes the variance of *μ* and of the parameter estimates, which are not minimal due to the incidence of heteroscedasticity. That is, the prediction of *Y* for a given value of *X*, based on the $\hat{\beta}$'s estimated from the original data, has a high variance.
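
The unbiasedness and inefficiency points above can be sketched in a small Monte Carlo experiment (illustrative values assumed; the deviation-form variance formula for $\hat{\beta}_1$ is the one referred to above). Under a heteroscedastic error the OLS slope remains unbiased, but the homoscedastic variance formula no longer matches the estimator's true sampling variance:

```python
# Sketch with assumed parameter values: OLS slope under heteroscedastic
# errors is still unbiased, but Var(b1) = sigma^2 / sum(x_i^2) (x_i in
# deviation form) no longer describes its true sampling variance.
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, n, reps = 2.0, 3.0, 50, 5_000
X = np.linspace(1.0, 10.0, n)
x = X - X.mean()                        # deviations from the mean

slopes = np.empty(reps)
for r in range(reps):
    mu = rng.normal(0.0, 0.1 * X**2)    # heteroscedastic: sd grows with X
    Y = beta0 + beta1 * X + mu
    slopes[r] = (x @ Y) / (x @ x)       # OLS slope estimate

print(slopes.mean())                    # close to beta1 = 3 (unbiased)

# Plug the "average" error variance into the homoscedastic formula:
formula_var = np.mean((0.1 * X**2) ** 2) / (x @ x)
print(slopes.var(), formula_var)        # true variance exceeds formula value
```

The estimates stay centered on the true slope, confirming unbiasedness, while the classical variance formula understates the actual spread of the estimates.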

**Tests for Homoscedasticity**


Some tests commonly used for testing the assumption of homoscedasticity are:

- Spearman Rank-Correlation test
- Goldfeld and Quandt test
- Glejser test
- Breusch–Pagan test
- Bartlett’s test of Homoscedasticity
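
As a rough sketch of how one of these tests works, the following implements the Breusch–Pagan statistic in its studentized (Koenker) form, $LM = nR^2$ from a regression of the squared residuals on *X*, using only numpy and simulated data:

```python
# Hedged sketch of the Breusch-Pagan test (Koenker's studentized form):
# regress squared OLS residuals on X; LM = n * R^2 is chi-square(k)
# under the null of homoscedasticity. Data here are simulated.
import numpy as np

def breusch_pagan_lm(X, resid):
    """LM = n * R^2 from regressing squared residuals on X (with intercept)."""
    n = len(resid)
    Z = np.column_stack([np.ones(n), X])
    u2 = resid ** 2
    coef, *_ = np.linalg.lstsq(Z, u2, rcond=None)
    fitted = Z @ coef
    r2 = 1.0 - np.sum((u2 - fitted) ** 2) / np.sum((u2 - u2.mean()) ** 2)
    return n * r2

rng = np.random.default_rng(2)
n = 500
X = rng.uniform(1.0, 10.0, n)
Y = 2.0 + 3.0 * X + rng.normal(0.0, 0.5 * X)   # heteroscedastic errors

# OLS residuals of Y on X
Z = np.column_stack([np.ones(n), X])
resid = Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]

lm = breusch_pagan_lm(X, resid)
# One regressor: compare with the chi-square(1) 5% critical value, 3.841
print(lm, lm > 3.841)   # heteroscedasticity should be detected
```

In practice a packaged implementation (e.g. `statsmodels.stats.diagnostic.het_breuschpagan`) would be used rather than this hand-rolled version.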

**Reference:**

Koutsoyiannis, A. (1972). *Theory of Econometrics*. 2nd Ed.

