Correlation analysis, using Pearson’s correlation coefficient and Spearman’s rank correlation, measures the strength and direction of the linear relationship between quantitative variables. This post discusses the coefficient of correlation with real-life examples.
Consider the following data for the illustration of the detection of heteroscedasticity using the Spearman Rank correlation test. The Data file is available to download.
The value of $t$ from the table at a 5% level of significance at 8 degrees of freedom is 2.306.
Since $t_{cal} < t_{tab}$, there is no evidence of a systematic relationship between the explanatory variable $X_2$ and the absolute value of the residuals ($|u_i|$); hence, there is no evidence of heteroscedasticity.
Since there is more than one regressor (the example is from the multiple regression model), therefore, Spearman’s Rank Correlation test should be repeated for each of the explanatory variables.
As an assignment, compute the Spearman rank correlation between $|u_i|$ and $X_3$ for the data above. Test the statistical significance of the coefficient in the same manner to explore evidence of heteroscedasticity.
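The testing procedure described above can be sketched in Python. The data below are made-up stand-ins for the downloadable file (so the numerical result is only illustrative), and `scipy` is assumed to be available:

```python
import numpy as np
from scipy import stats

# Hypothetical data standing in for the downloadable file (n = 10)
x2 = np.array([12, 15, 18, 21, 24, 27, 30, 33, 36, 39], dtype=float)
abs_resid = np.array([1.2, 0.8, 1.5, 0.9, 2.1, 1.1, 2.4, 1.8, 2.9, 2.2])

n = len(x2)
rs, _ = stats.spearmanr(x2, abs_resid)   # Spearman rank correlation

# t statistic for testing H0: rho_s = 0, with n - 2 degrees of freedom
t_cal = rs * np.sqrt((n - 2) / (1 - rs**2))
t_tab = stats.t.ppf(0.975, df=n - 2)     # two-sided 5% critical value (2.306 for 8 df)

print(f"r_s = {rs:.3f}, t_cal = {t_cal:.3f}, t_tab = {t_tab:.3f}")
if abs(t_cal) > t_tab:
    print("Evidence of heteroscedasticity")
else:
    print("No evidence of heteroscedasticity")
```

Repeating the same call with `x3` in place of `x2` carries out the assigned exercise.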
Covariance and correlation are very important terminologies in statistics. Covariance measures the degree to which two variables co-vary (i.e., vary/change together). If the greater values of one variable (say, $X_i$) correspond with the greater values of the other variable (say, $X_j$), i.e., if the variables tend to show similar behavior, then the covariance between two variables ($X_i$, $X_j$) will be positive.
Similarly, if the smaller values of one variable correspond with the smaller values of the other variable, then the covariance between the two variables will be positive. In contrast, if the greater values of one variable (say, $X_i$) mainly correspond to the smaller values of the other variables (say, $X_j$), i.e., both of the variables tend to show opposite behavior, then the covariance will be negative.
In other words, positive covariance between two variables means they (both of the variables) vary/change together in the same direction relative to their expected values (averages). It means that if one variable moves above its average value, the other variable tends to be above its average value.
Similarly, if the covariance is negative between the two variables, then one variable tends to be above its expected value, while the other variable tends to be below its expected value. If covariance is zero then it means that there is no linear dependency between the two variables.
Mathematical Representation of Covariance
Mathematically, the covariance between two random variables $X_i$ and $X_j$ can be represented as \[COV(X_i, X_j)=E[(X_i-\mu_i)(X_j-\mu_j)]\] where $\mu_i=E(X_i)$ is the average of the first variable and $\mu_j=E(X_j)$ is the average of the second variable.
Note that the covariance of a random variable with itself is the variance of the random variable, i.e. $COV(X_i, X_i)=VAR(X_i)$. If $X_i$ and $X_j$ are independent, then $E(X_i X_j)=E(X_i)E(X_j)$ and $COV(X_i, X_j)=E(X_i X_j)-E(X_i) E(X_j)=0$.
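The definition can be checked numerically. The following sketch uses simulated data (the numbers are arbitrary) to compute $E[(X_i-\mu_i)(X_j-\mu_j)]$ directly and to confirm that the covariance of a variable with itself is its variance:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 2 * x + rng.normal(size=1000)   # y co-varies positively with x

# Covariance as the mean of products of deviations: E[(X - mu_x)(Y - mu_y)]
cov_xy = np.mean((x - x.mean()) * (y - y.mean()))

# Covariance of a variable with itself equals its variance
cov_xx = np.mean((x - x.mean()) ** 2)

print(f"Cov(X, Y) = {cov_xy:.3f}  (positive: the variables move together)")
print(f"Cov(X, X) = {cov_xx:.3f}, Var(X) = {np.var(x):.3f}")
```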
Covariance and Correlation
Correlation and covariance are related measures but not equivalent statistical measures.
Equation of Correlation (Normalized Covariance)
The correlation between two variables ($X_i$ and $X_j$) is their normalized covariance, defined as \begin{aligned} \rho_{i,j}&=\frac{E[(X_i-\mu_i)(X_j-\mu_j)]}{\sigma_i \sigma_j}\\ &=\frac{n \sum XY - \sum X \sum Y}{\sqrt{(n \sum X^2 -(\sum X)^2)(n \sum Y^2 - (\sum Y)^2)}} \end{aligned} where $\sigma_i$ is the standard deviation of $X_i$ and $\sigma_j$ is the standard deviation of $X_j$.
Note that correlation is dimensionless, i.e. a number that is free of the measurement unit and its values lie between -1 and +1 inclusive. In contrast, covariance has a unit of the product of the units of two variables.
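Both forms of the formula above give the same value, which can be verified on a small made-up data set:

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.0, 3.0, 4.0, 7.0, 9.0])
n = len(x)

# Normalized-covariance form: Cov(X, Y) / (sigma_x * sigma_y)
rho = np.mean((x - x.mean()) * (y - y.mean())) / (x.std() * y.std())

# Computational form: (n*Sxy - Sx*Sy) / sqrt((n*Sxx - Sx^2)(n*Syy - Sy^2))
num = n * np.sum(x * y) - np.sum(x) * np.sum(y)
den = np.sqrt((n * np.sum(x**2) - np.sum(x)**2)
              * (n * np.sum(y**2) - np.sum(y)**2))
r = num / den

print(f"rho = {rho:.4f}, r = {r:.4f}")   # identical, and dimensionless
```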
When to Use Covariance and Correlation
The covariance and correlation should be used as described below:
Covariance: Useful in portfolio theory (finance).
Correlation: Preferred in most cases (e.g., psychology, medicine, ML) due to standardized interpretation.
For example, the correlation between study hours & exam scores can be used to measure the strength of the relationship (e.g., $r = 0.7$ shows a strong positive link between study hours and exam scores).
Similarly, the covariance between stock returns helps in diversification.
The Sign of Covariance
The sign of the covariance matters:
Positive Covariance: Variables move together (↑X → ↑Y).
Negative Covariance: Variables move in opposite directions (↑X → ↓Y).
The value of covariance depends on units (for example, covariance of “hours vs. scores” $\ne$ “minutes vs. scores”). For unitless measures, use correlation for standardized interpretation.
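The unit-dependence of covariance (and the unit-invariance of correlation) is easy to demonstrate with a toy "hours vs. scores" data set; the numbers below are invented for illustration:

```python
import numpy as np

hours = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
scores = np.array([52.0, 55.0, 61.0, 70.0, 74.0])
minutes = hours * 60   # the same information, in a different unit

# Covariance scales with the unit change ...
cov_h = np.cov(hours, scores)[0, 1]
cov_m = np.cov(minutes, scores)[0, 1]
print(f"Cov(hours, scores)   = {cov_h:.2f}")
print(f"Cov(minutes, scores) = {cov_m:.2f}")

# ... while correlation does not
r_h = np.corrcoef(hours, scores)[0, 1]
r_m = np.corrcoef(minutes, scores)[0, 1]
print(f"r (hours)   = {r_h:.4f}")
print(f"r (minutes) = {r_m:.4f}")
```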
The coefficient of correlation ($r$) measures the strength and direction of a linear relationship between two variables. In this post, we will discuss the coefficient of correlation and the coefficient of determination.
Correlation Coefficient Ranges
The correlation coefficient ranges from $-1$ to $+1$: a value of $+1$ indicates a perfect positive correlation (as one variable increases, the other increases proportionally), a value of $-1$ indicates a perfect negative correlation (as one variable increases, the other decreases proportionally), and a value of $0$ indicates no linear correlation between the variables.
Values between $-1$ and $+1$ indicate the degree of strength and direction of the relationship. The strength of the correlation depends on the absolute value of $r$:
Range of Correlation Value ($|r|$) and its Interpretation:
0.90 to 1.00: Very strong correlation
0.70 to 0.89: Strong correlation
0.40 to 0.69: Moderate correlation
0.10 to 0.39: Weak correlation
0.00 to 0.09: No or negligible correlation
The closer the value of the correlation coefficient is to ±1, the stronger the linear relationship.
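The strength labels above can be captured in a small helper function; `interpret_r` is an illustrative name, not a standard library routine:

```python
def interpret_r(r: float) -> str:
    """Map the absolute value of r to the strength labels listed above."""
    a = abs(r)
    if a >= 0.90:
        return "Very strong correlation"
    if a >= 0.70:
        return "Strong correlation"
    if a >= 0.40:
        return "Moderate correlation"
    if a >= 0.10:
        return "Weak correlation"
    return "No or negligible correlation"

print(interpret_r(0.7))    # Strong correlation
print(interpret_r(-0.25))  # Weak correlation (sign gives direction, not strength)
```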
Coefficient of Determination
We know that the ratio of the explained variation to the total variation is called the coefficient of determination, which is the square of the correlation coefficient. This ratio is non-negative and lies between $0$ and $1$; it is therefore denoted by $r^2$, thus \[r^2=\frac{\text{Explained Variation}}{\text{Total Variation}}\]
It can be seen that if the total variation is all explained, the ratio $r^2$ (Coefficient of Determination) is one, and if the total variation is all unexplained, then the explained variation and the ratio $r^2$ are zero.
The square root of the coefficient of determination is called the correlation coefficient, given by \[r=\pm\sqrt{\frac{\text{Explained Variation}}{\text{Total Variation}}}\]
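The variation-ratio definition can be checked on a small simple-regression example. The data are invented, and the least-squares fit is obtained with `np.polyfit`:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit a least-squares line y_hat = a + b*x (polyfit returns [slope, intercept])
b, a = np.polyfit(x, y, 1)
y_hat = a + b * x

explained = np.sum((y_hat - y.mean()) ** 2)   # explained variation
total = np.sum((y - y.mean()) ** 2)           # total variation

r_squared = explained / total
r = np.sign(b) * np.sqrt(r_squared)           # sign comes from the slope

print(f"r^2 = {r_squared:.4f}, r = {r:.4f}")
```

For simple linear regression, this $r$ coincides with Pearson's correlation coefficient between $x$ and $y$.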
We also know that the variance of any random variable is $\ge 0$; it equals zero ($Var(X)=0$) if and only if $X$ is a constant (almost surely). Therefore, for the standardized variables $X^*=(X-\mu_X)/\sigma_X$ and $Y^*=(Y-\mu_Y)/\sigma_Y$,
\[0 \le V(X^* \pm Y^*)=V(X^*)+V(Y^*)\pm2Cov(X^*,Y^*)\]
As $Var(X^*)=1$ and $Var(Y^*)=1$, the right-hand side equals $2 \pm 2Cov(X^*,Y^*)$, which would be negative if $Cov(X^*,Y^*)$ were either greater than 1 or less than -1. Hence \[-1 \leq \rho(X,Y)=\rho(X^*,Y^*)\leq 1.\]
If $\rho(X,Y)=Cov(X^*,Y^*)=1$ then $Var(X^*-Y^*)=0$, making $X^* = Y^*$ almost surely. Similarly, if $\rho(X,Y)=Cov(X^*,Y^*)=-1$ then $X^* = -Y^*$ almost surely. In either case, $Y$ would be a linear function of $X$ almost surely.