
Pearson Correlation Coefficient: Use, Interpretation, and Properties

The correlation coefficient, or Pearson’s Correlation Coefficient, was developed by Karl Pearson in the 1900s. Pearson’s Correlation Coefficient is a measure of the degree (strength) of the linear relationship between two continuous random variables; it is denoted by $\rho_{XY}$ for a population and by $r_{XY}$ for a sample.

The correlation coefficient can take values in the interval $[-1, 1]$. If the coefficient value is 1 or -1, there is a perfect linear relationship between the variables. A positive sign on the coefficient indicates a positive (direct) relationship, while a negative sign indicates a negative (indirect, or inverse) relationship between the variables. A zero value implies the absence of a linear relationship, but it does not mean that the variables are independent: there may be some other sort of relationship between the variables of interest, such as a curvilinear or circular relationship.

Correlation coefficient

Mathematically, if two random variables $X$ and $Y$ follow an unknown joint distribution, then the simple linear correlation coefficient equals the covariance between $X$ and $Y$ divided by the product of their standard deviations, i.e.

\[\rho=\frac{Cov(X, Y)}{\sigma_X \sigma_Y}\]

where $Cov(X, Y)$ is the covariance between $X$ and $Y$, and $\sigma_X$ and $\sigma_Y$ are the respective standard deviations of the random variables.

For a sample of size $n$, $(X_1, Y_1),(X_2, Y_2),\cdots,(X_n, Y_n)$, from the joint distribution, the quantity given below is an estimate of $\rho$, called the sample correlation coefficient and denoted by $r$.

\begin{eqnarray*}
r&=&\frac{\sum_{i=1}^{n}(X_i-\bar{X})(Y_i-\bar{Y})}{\sqrt{\sum_{i=1}^{n}(X_i-\bar{X})^2 \times \sum_{i=1}^{n}(Y_i-\bar{Y})^2}}\\
&=& \frac{Cov(X,Y)}{S_X S_Y}
\end{eqnarray*}
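
As a quick illustration (not from the original article, and using made-up numbers), the following Python sketch computes the sample correlation coefficient directly from the formula above and checks it against NumPy's built-in routine.

import numpy as np

# Hypothetical sample data (for illustration only)
x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.5, 3.7, 5.1, 8.2, 9.9])

# r = sum((x_i - x_bar)(y_i - y_bar)) / sqrt(sum((x_i - x_bar)^2) * sum((y_i - y_bar)^2))
dx = x - x.mean()
dy = y - y.mean()
r = np.sum(dx * dy) / np.sqrt(np.sum(dx**2) * np.sum(dy**2))

print(round(r, 4))                        # r computed from the definition
print(round(np.corrcoef(x, y)[0, 1], 4))  # the same value from np.corrcoef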

Note that

  • The existence of a statistical correlation does not mean that there exists a cause-and-effect relation between the variables. Cause and effect means that a change in one variable causes a change in the other variable.
  • The changes in the variables may be due to a common cause or random variations.
  • There are many kinds of correlation coefficients. The choice of which to use for a particular set of data depends on different factors such as
    • Type of Scale (Measurement Scale) used to express the variables
    • Nature of the underlying distribution (continuous or discrete)
    • Characteristics of the distribution of the scores (linear or non-linear)
  • Correlation is perfectly linear if a constant change in $X$ is accompanied by a constant change in $Y$. In this case, all the points in the scatter diagram will lie on a straight line.
  • A high correlation coefficient does not necessarily imply a direct dependence of the variables. For example, there may be a high correlation between the number of crimes and shoe prices. Such correlations are referred to as nonsense or spurious correlations.

Properties of the Correlation Coefficient

  1. The correlation coefficient is symmetrical with respect to $X$ and $Y$ i.e. $r_{XY}=r_{YX}$.
  2. The Correlation coefficient is a pure number and does not depend upon the units in which the variables are measured.
  3. The correlation coefficient is the geometric mean of the two regression coefficients. Thus, if the two regression lines of $Y$ on $X$ and $X$ on $Y$ are written as $Y=a+bX$ and $X=c+dY$ respectively, then $bd=r^2$.
  4. The correlation coefficient is independent of the choice of origin and scale of measurement of the variables, i.e. $r$ remains unchanged if constants are added to or subtracted from the variables, or if the variables are multiplied or divided by positive constants (see the sketch after this list).
  5. The correlation coefficient lies between -1 and +1, symbolically $-1\le r \le 1$.
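
A minimal sketch (with hypothetical numbers, not from the original article) illustrating properties 1 and 4: $r$ is symmetric in $X$ and $Y$, and it is unchanged under a change of origin and scale of either variable.

import numpy as np

x = np.array([3.0, 5.0, 7.0, 9.0, 11.0])   # hypothetical data
y = np.array([2.0, 6.0, 5.0, 10.0, 12.0])  # hypothetical data

r_xy = np.corrcoef(x, y)[0, 1]
r_yx = np.corrcoef(y, x)[0, 1]
print(np.isclose(r_xy, r_yx))              # True: r_XY = r_YX (Property 1)

# Change of origin and scale: u = (x - 5) / 2, v = 10 * y + 3
u = (x - 5.0) / 2.0
v = 10.0 * y + 3.0
print(np.isclose(r_xy, np.corrcoef(u, v)[0, 1]))  # True: r unchanged (Property 4)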

Pearson’s Correlation Coefficient in SPSS

Pearson’s correlation, also called the correlation coefficient or simply the correlation, is used to find the degree of linear relationship between two continuous variables. The absolute value of the correlation coefficient lies between 0.00 (no correlation) and 1.00 (perfect correlation). Generally, correlations above 0.80 are considered quite high.

Remember:

  1. Correlation measures the interdependence of continuous variables; it does not refer to any cause and effect.
  2. Correlation is used to determine the linear relationship between variables.
  3. Draw a scatter plot before performing/calculating the correlation, to check the assumption of linearity (see the sketch after this list).
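
The steps below use SPSS; as a language-agnostic illustration of point 3 (not part of the original article), the following Python sketch draws the scatter plot first and only then computes Pearson's r for two hypothetical score variables.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
anxiety = rng.normal(50, 10, size=46)                    # hypothetical scores
depression = 0.3 * anxiety + rng.normal(35, 9, size=46)  # loosely related, hypothetical

# 1. Inspect the scatter plot for an (approximately) linear pattern
plt.scatter(anxiety, depression)
plt.xlabel("anxiety")
plt.ylabel("depression")
plt.show()

# 2. Only then compute Pearson's r
print(round(np.corrcoef(anxiety, depression)[0, 1], 3))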

How to Compute the Correlation Coefficient in SPSS

The command for correlation is found at Analyze –> Correlate –> Bivariate i.e.

Analyze-->Correlate-->Bivariate...

The Bivariate Correlations dialog box will appear:

[Figure: Bivariate Correlations dialog box in SPSS]

Select one of the variables that you want to correlate in the left-hand pane of the Bivariate Correlations dialog box and move it into the Variables pane on the right-hand side by clicking the arrow button. Then click on the other variable that you want to correlate in the left-hand pane and move it into the Variables pane by clicking the arrow button.

[Figure: Bivariate Correlations box]

Output

[Figure: Output from the correlation test]

The Correlations table in the output gives the values of the specified correlation tests, such as Pearson’s correlation. Each row of the table corresponds to one of the variables, and each column likewise corresponds to one of the variables.
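
For comparison (not part of the original article), a pandas one-liner produces the same kind of symmetric layout, with each variable perfectly correlated (1.0) with itself on the diagonal. The variable names match the example below, but the data values are hypothetical; note that pandas' .corr() shows only the coefficients, not the significance or the number of observations that SPSS also reports.

import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
anxiety = rng.normal(50, 10, size=46)                    # hypothetical scores
depression = 0.3 * anxiety + rng.normal(35, 9, size=46)  # hypothetical scores

df = pd.DataFrame({"anxiety": anxiety, "depression": depression})
print(df.corr(method="pearson"))  # rows and columns both correspond to the variables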

Interpreting Correlation Coefficient

In this example, the cell in the bottom row of the right column represents the correlation of depression with depression, which equals 1.0. Likewise, the cell in the middle row of the middle column represents the correlation of anxiety with anxiety, which is also 1.0. In both cases, a variable is perfectly correlated with itself.

The cell in the middle row of the right column (or, equivalently, the cell in the bottom row of the middle column) is more interesting. This cell represents the correlation of anxiety with depression (or of depression with anxiety). There are three numbers in these cells.

  1. The top number is the correlation coefficient value, which is 0.310.
  2. The middle number is the significance of this correlation, which is 0.018.
  3. The bottom number, 46, is the number of observations that were used to calculate the correlation coefficient between the variables of study.

Note that the significance (p-value) tells us how likely it would be to observe a correlation this large purely due to chance factors, rather than due to an actual relationship. In this case, it is improbable that we would get an r (correlation coefficient) this large if there were no relationship between the variables. The sketch below shows how this significance can be obtained from r and n.
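
As a sketch (not SPSS's own code), the p-value for a Pearson correlation is commonly obtained from a t statistic with $n-2$ degrees of freedom, $t = r\sqrt{n-2}/\sqrt{1-r^2}$. Plugging in the values reported above ($r = 0.310$, $n = 46$) gives the following; note that SPSS's Bivariate Correlations dialog offers both one-tailed and two-tailed significance tests.

from math import sqrt
from scipy.stats import t

r, n = 0.310, 46                  # values from the output above
df = n - 2
t_stat = r * sqrt(df) / sqrt(1 - r**2)

p_one_tailed = t.sf(t_stat, df)   # about 0.018, matching the significance reported above
p_two_tailed = 2 * p_one_tailed   # about 0.036

print(round(t_stat, 3), round(p_one_tailed, 3), round(p_two_tailed, 3))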

 
