Consequences of Autocorrelation

In this post, we will discuss some important consequences of autocorrelation in the data. The consequences for the OLS estimators in the presence of autocorrelation can be summarized as follows:

Consequences of Autocorrelation on OLS Estimators when it Exists

  • When the disturbance terms are serially correlated, the OLS estimators of the $\hat{\beta}$s are still unbiased and consistent, but the optimality property (minimum variance property) is no longer satisfied.
  • The OLS estimators will be inefficient and therefore, no longer BLUE.
  • The estimated variances of the regression coefficients will be biased and inconsistent; under positive autocorrelation they typically understate the true sampling variances, so hypothesis testing is no longer valid. In most cases, $R^2$ will be overestimated (indicating a better fit than the one that truly exists), and the t- and F-statistics will tend to be inflated.
  • The variance of the random term $u$ may be underestimated if the $u$’s are autocorrelated. That is, the estimator $\hat{\sigma}^2=\frac{\sum \hat{u}_i^2}{n-2}$ is likely to underestimate the true $\sigma^2$.
  • Another consequence of autocorrelation is that if the disturbance terms are autocorrelated, the OLS estimators are not asymptotically efficient. That is, the $\hat{\beta}$s lose their asymptotic efficiency.
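The consequences above can be checked with a small Monte Carlo sketch. The setup below is hypothetical (the model $y = 1 + 2x + u$ with AR(1) errors and $\rho = 0.8$ is chosen for illustration, not taken from the post): fitting OLS over many simulated samples, the conventional standard error of the slope tends to understate the true sampling variability of $\hat{\beta}$.

```python
import numpy as np

# Hypothetical illustration: y = 1 + 2*x + u, where u follows an AR(1)
# process u_t = rho*u_{t-1} + e_t with positive rho. Under positive
# autocorrelation, the conventional OLS standard error tends to be
# smaller than the actual (Monte Carlo) spread of the slope estimates.
rng = np.random.default_rng(0)
n, rho, reps = 100, 0.8, 2000
x = np.linspace(0.0, 10.0, n)

slopes, reported_se = [], []
for _ in range(reps):
    e = rng.normal(size=n)
    u = np.empty(n)
    u[0] = e[0] / np.sqrt(1 - rho**2)   # draw from the stationary distribution
    for t in range(1, n):
        u[t] = rho * u[t - 1] + e[t]
    y = 1.0 + 2.0 * x + u

    # OLS fit and the usual (autocorrelation-ignoring) variance estimate
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2_hat = resid @ resid / (n - 2)        # \hat{sigma}^2 = sum(u_i^2)/(n-2)
    cov = sigma2_hat * np.linalg.inv(X.T @ X)
    slopes.append(beta[1])
    reported_se.append(np.sqrt(cov[1, 1]))

true_sd = np.std(slopes)            # actual sampling spread of the slope
avg_reported = np.mean(reported_se) # average conventional standard error
print(f"Monte Carlo SD of slope : {true_sd:.4f}")
print(f"Avg. conventional SE    : {avg_reported:.4f}")
```

In a run of this sketch the average reported standard error comes out well below the Monte Carlo standard deviation, which is exactly why t-statistics appear inflated and tests reject too often when autocorrelation is ignored.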
Figure: Patterns of Autocorrelation and Non-Autocorrelation

Learn about Autocorrelation and the Reasons for Autocorrelation

Learn more about Autocorrelation on Wikipedia

