Consequences of Heteroscedasticity (2013)

When heteroscedasticity is present in the data, estimates based on Ordinary Least Squares (OLS) are affected in several ways.

The consequences of heteroscedasticity are as follows:

  1. We cannot apply the usual formulas for the variances of the coefficients to conduct tests of significance and construct confidence intervals.
  2. If the error term (ui) is heteroscedastic, the OLS estimates do not have the minimum variance property in the class of unbiased estimators, i.e. they are inefficient in small samples. Furthermore, they are asymptotically inefficient. The large standard errors may lead to incorrect conclusions about the statistical significance of the regression coefficients.
  3. The estimated coefficients remain statistically unbiased; that is, the unbiasedness property of OLS estimation is not violated by the presence of heteroscedasticity.
  4. Forecasts based on a model with heteroscedasticity are less efficient, because OLS estimation yields higher variances for the estimated coefficients.

All this means that the standard errors will be underestimated, and hence the t-statistics and F-statistics will be inaccurate. Heteroscedasticity can be caused by several factors, but the main one is variables whose values differ substantially across observations. For instance, a GDP variable will suffer from heteroscedasticity if we include both large countries such as the USA and small countries such as Cuba; in this case, it may be better to use GDP per person. Also note that heteroscedasticity tends to affect cross-sectional data more than time series data.
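A small simulation can make the unbiasedness point concrete (all numbers and the pure-Python estimator below are illustrative, not from the original article): even when the error standard deviation grows with x, the OLS slope estimates still average out to the true value.

```python
# Monte Carlo sketch (illustrative values): OLS slope estimates remain
# unbiased even when the error variance grows with x (heteroscedasticity).
import random

random.seed(1)

def ols_slope(x, y):
    """Closed-form OLS slope for a simple linear regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

true_beta = 2.0
x = [i / 10 for i in range(1, 101)]  # fixed regressor values, 0.1 .. 10.0

slopes = []
for _ in range(2000):
    # heteroscedastic errors: the standard deviation is proportional to x
    y = [1.0 + true_beta * xi + random.gauss(0, 0.5 * xi) for xi in x]
    slopes.append(ols_slope(x, y))

mean_slope = sum(slopes) / len(slopes)
print(round(mean_slope, 2))  # close to true_beta = 2.0 (unbiased)
```

Comparing the empirical variance of `slopes` with the classical OLS variance formula would show the other half of the story: under heteroscedasticity the usual formula misstates the true sampling variability, which is why the tests and intervals built on it become unreliable.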

Consider the simple linear regression model (SLRM)

$$Y_i = \alpha + \beta X_i + \epsilon_i$$

The OLS estimates of $\beta$ and $\alpha$ are derived below. Writing lowercase $x_i = X_i - \bar{X}$ and $y_i = Y_i - \bar{Y}$ for deviations from the means (so that $\sum x_i = 0$ and the mean of the errors drops out of the cross-product term),

$$\hat{\beta} = \frac{\sum x_i y_i}{\sum x_i^2} = \frac{\sum x_i(\beta x_i + \epsilon_i)}{\sum x_i^2} = \frac{\beta \sum x_i^2}{\sum x_i^2} + \frac{\sum x_i \epsilon_i}{\sum x_i^2} = \beta + \frac{\sum x_i \epsilon_i}{\sum x_i^2}$$

Applying expectations on both sides we get:

$$E(\hat{\beta}) = \beta + \frac{\sum E(x_i \epsilon_i)}{\sum x_i^2} = \beta, \qquad \text{since } E(\epsilon_i x_i) = x_i E(\epsilon_i) = 0$$

Similarly

$$\hat{\alpha} = \bar{Y} - \hat{\beta}\bar{X} = \alpha + \beta\bar{X} + \bar{\epsilon} - \hat{\beta}\bar{X}, \qquad E(\hat{\alpha}) = \alpha + \beta\bar{X} + 0 - \bar{X}\beta = \alpha$$

For further details about the Consequences of Heteroscedasticity on OLS parameters, see https://itfeature.com/hetero/hetero-intro/heteroscedasticity-consequences/

Hence, the unbiasedness property of OLS estimation is not affected by Heteroscedasticity.
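The slope decomposition derived above can be checked numerically. The sketch below (all parameter and data values are made up for illustration) builds Y from known α, β, and disturbances, and verifies that in deviation form the OLS slope equals β plus Σxiϵi/Σxi², exactly as in the derivation.

```python
# Numeric check of the derivation (illustrative values): in deviation form,
# the OLS slope decomposes exactly as beta + sum(x_i * eps_i) / sum(x_i^2).
X = [1.0, 2.0, 4.0, 7.0, 11.0]
eps = [0.3, -0.1, 0.2, -0.4, 0.05]     # made-up disturbances
alpha, beta = 1.5, 0.8                  # made-up true parameters
Y = [alpha + beta * Xi + ei for Xi, ei in zip(X, eps)]

n = len(X)
Xbar, Ybar = sum(X) / n, sum(Y) / n
x = [Xi - Xbar for Xi in X]             # deviations from the mean
y = [Yi - Ybar for Yi in Y]

beta_hat = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi ** 2 for xi in x)

# beta + sum(x_i * eps_i) / sum(x_i^2); note sum(x_i) = 0 in deviation
# form, so the mean of the errors vanishes from the cross-product term
decomposition = beta + sum(xi * ei for xi, ei in zip(x, eps)) / sum(xi ** 2 for xi in x)

print(abs(beta_hat - decomposition) < 1e-9)  # True: the identity holds
```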

[Figure: Heteroscedasticity pattern]



Linear Regression Model Assumptions

The linear regression model (LRM) is based on certain statistical assumptions. Some of them relate to the distribution of the random error term ui, some to the relationship between the error term ui and the explanatory variables (independent variables, X's), and some to the independent variables themselves. The linear regression model assumptions can be classified into two categories:

  1. Stochastic Assumptions
  2. Non-Stochastic Assumptions

These linear regression model assumptions (equivalently, assumptions about the ordinary least squares (OLS) method) are critical for interpreting the regression coefficients.

Regression Model Assumptions
  • The error term (ui) is a random real number, i.e. ui may assume any positive, negative, or zero value by chance. Each value occurs with a certain probability; therefore, the error term is a random variable.
  • The mean value of u is zero, i.e. E(ui) = 0; that is, the mean of ui conditional on the given Xi is zero. For each value of the variable Xi, u may take various values, some greater than zero and some smaller than zero. Considering all possible values of u for any particular value of X, the disturbance term ui has a zero mean.
  • The variance of ui is constant, i.e. for the given values of X, the variance of ui is the same for all observations: E(ui²) = σ². The disturbance terms show the same dispersion about their mean at all values of X (homoscedasticity).
  • The variable ui has a normal distribution, i.e. ui ∼ N(0, σu²). The value of u (for each Xi) has a bell-shaped symmetrical distribution.
  • The random terms of different observations (ui, uj) are independent, i.e. E(ui uj) = 0 for i ≠ j; that is, there is no autocorrelation between the disturbances. The random term assumed in one period does not depend on its values in any other period.
  • ui and Xi have zero covariance, i.e. u is independent of the explanatory variable: E(ui Xi) = 0, i.e. Cov(ui, Xi) = 0. The disturbance term u and the explanatory variable X are uncorrelated; the u's and X's do not tend to vary together, as their covariance is zero. This assumption is automatically fulfilled if the X variable is nonrandom (non-stochastic) and the mean of the random term is zero.
  • All the explanatory variables are measured without error; that is, the regressors are assumed to be error-free, while y (the dependent variable) may or may not include measurement error.
  • The number of observations n must be greater than the number of parameters to be estimated, or the number of observations must be greater than the number of explanatory (independent) variables.
  • There should be variability in the X values; that is, the X values in a given sample must not all be the same. Statistically, Var(X) must be a finite positive number.
  • The regression model must be correctly specified, meaning there is no specification bias or error in the model used in empirical analysis.
  • No perfect or near-perfect multicollinearity (collinearity) exists among two or more explanatory (independent) variables.
  • Values taken by the regressors X are considered fixed in repeated sampling, i.e. X is assumed to be non-stochastic. Regression analysis is conditional on the given values of the regressor(s) X.
  • The linear regression model is linear in the parameters, e.g. yi=β1+β2xi+ui
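Two of these assumptions, a zero-mean error term and zero covariance between the error and X, are mirrored mechanically in the fitted residuals of any OLS regression with an intercept, because the normal equations impose them on the sample. A small sketch with made-up data:

```python
# Sketch with made-up data: OLS residuals sum to zero and are
# uncorrelated with the regressor, mirroring E(u)=0 and Cov(u, X)=0.
X = [1.0, 2.0, 3.0, 5.0, 8.0]
Y = [2.1, 3.9, 6.2, 9.8, 16.1]

n = len(X)
Xbar, Ybar = sum(X) / n, sum(Y) / n
b1 = (sum((Xi - Xbar) * (Yi - Ybar) for Xi, Yi in zip(X, Y))
      / sum((Xi - Xbar) ** 2 for Xi in X))
b0 = Ybar - b1 * Xbar

# residuals of the fitted line
resid = [Yi - (b0 + b1 * Xi) for Xi, Yi in zip(X, Y)]

print(abs(sum(resid)) < 1e-9)                                # True
print(abs(sum(Xi * ei for Xi, ei in zip(X, resid))) < 1e-9)  # True
```

Note the distinction: these sample identities hold by construction for the residuals, whereas the assumptions above concern the unobservable disturbances ui.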


Simple Linear Regression Model (SLRM)

A simple linear regression model (SLRM) is based on a single independent (explanatory) variable, and it fits a straight line such that the sum of squared residuals of the regression model (the vertical distances between the fitted line and the points of the data set) is as small as possible. The simple linear regression model (usually known as a statistical or probabilistic model) is

$$y_i = \alpha + \beta x_i + \varepsilon_i \quad \text{or} \quad y_i = b_0 + b_1 x_i + \varepsilon_i \quad \text{or} \quad y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$$
where y is the dependent variable and x is the independent variable. In the regression context, y is the regressand and x is the regressor. The epsilon (ε) is unobservable, denoting the random error or disturbance term of the regression model. There are specific reasons for including the random error ε in the regression model:

Importance of Error Term in Simple Linear Regression Model

  1. Random error (ε) captures the effect on the dependent variable of all variables not included in the model under study; such omitted variables may or may not be observable.
  2. Random error (ε) captures any specification error related to the assumed linear-functional form.
  3. Random error (ε) captures the effect of unpredictable random components present in the dependent variable.

We can say that ε is the variation in the variable y not explained (unexplained) by the independent variable x included in the model.

In the above equation or model, β0 and β1 are the parameters of the model, and our main objective is to obtain estimates of their numerical values, i.e. β0^ and β1^. β0 is the intercept (regression constant), and the fitted line passes through (x̄, ȳ), the center of mass of the data points. β1 is the slope (regression coefficient) of the model; the slope equals the correlation between the variables x and y multiplied by the ratio of their standard deviations (sy/sx).

The subscript i denotes the ith value of the variable in the model.
y = β0 + β1x

This is a mathematical (deterministic) model, as all the variation in y is due solely to changes in x; there are no other factors affecting the dependent variable. If this were true, all the pairs (x, y) would fall on a straight line when plotted on a two-dimensional plane. However, for observed values, the plot may or may not be a straight line. A two-dimensional diagram with the points plotted in pairs is called a scatter diagram.
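As a worked sketch of estimating the statistical model (the data below are invented for illustration), the closed-form least-squares estimates are b1 = Sxy/Sxx and b0 = ȳ − b1x̄, and the fitted line passes through the center of mass (x̄, ȳ):

```python
# Fit a simple linear regression by the closed-form formulas
# (made-up data): b1 = Sxy / Sxx, b0 = ybar - b1 * xbar.
x = [1, 2, 3, 4, 5]
y = [2.0, 4.1, 5.9, 8.2, 9.8]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sxy / sxx           # slope (regression coefficient)
b0 = ybar - b1 * xbar    # intercept (regression constant)

print(round(b1, 2), round(b0, 2))           # 1.97 0.09
print(abs((b0 + b1 * xbar) - ybar) < 1e-9)  # True: line passes through (xbar, ybar)
```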

[Figure: Scatter diagram of a simple linear regression model with the fitted regression line]

See Assumptions about Simple Linear Regression Model

FAQs about Simple Linear Regression Models

  1. What is a simple linear regression Model?
  2. What is a Probabilistic/ Statistical model?
  3. What is the equation of a simple linear regression model?
  4. Write about the importance of error terms in the regression model.
  5. What are the parameters in a simple linear regression model?
  6. What is the objective of estimating the parameters of a simple linear regression model?
