**Pearson Correlation Coefficient**

The correlation coefficient, or Pearson's Correlation Coefficient, was introduced by Karl Pearson in the early 1900s. The correlation coefficient is a measure of the strength (degree) of the linear relationship between two continuous random variables. It is denoted by *ρ _{XY}* for a population and by *r _{XY}* for a sample.

The correlation coefficient takes values in the interval *[-1, 1]*. If the coefficient is 1 or -1, there is a perfect linear relationship between the variables. A positive sign indicates a positive (direct, or supportive) relationship, while a negative sign indicates a negative (indirect, opposite) relationship between the variables. A value of zero implies the absence of a *linear* relation; it does not, in general, imply that the variables are independent, since there may be some other sort of relationship between the variables of interest, such as a non-linear or circular relation.

Mathematically, if two random variables such as *X* and *Y* follow an unknown joint distribution then the simple linear correlation coefficient is equal to covariance between *X* and *Y* divided by the product of their standard deviations i.e

\[\rho=\frac{Cov(X, Y)}{\sigma_X \sigma_Y}\]

where *Cov(X, Y)* is the covariance between *X* and *Y*, and σ_{X} and σ_{Y} are the respective standard deviations of the random variables.

For a sample of size *n*, *(X _{1}, Y_{1}), (X_{2}, Y_{2}), …, (X_{n}, Y_{n})*, from the joint distribution, the quantity given below is an estimate of *ρ*, called the sample correlation coefficient and denoted by *r*.

\begin{eqnarray*}
r&=&\frac{\sum_{i=1}^{n}(X_i-\bar{X})(Y_i-\bar{Y})}{\sqrt{\sum_{i=1}^{n}(X_i-\bar{X})^2 \times \sum_{i=1}^{n}(Y_i-\bar{Y})^2}}\\
&=& \frac{Cov(X,Y)}{S_X S_Y}
\end{eqnarray*}
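As a sketch, the sample formula above can be computed directly; the snippet below does so in Python with NumPy (the data values are made up purely for illustration):

```python
import numpy as np

def pearson_r(x, y):
    """Sample correlation coefficient computed directly from the formula."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    dx = x - x.mean()          # deviations of X from its mean
    dy = y - y.mean()          # deviations of Y from its mean
    # numerator: sum of cross-products; denominator: sqrt of product of
    # the two sums of squared deviations
    return (dx * dy).sum() / np.sqrt((dx ** 2).sum() * (dy ** 2).sum())

x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
print(round(pearson_r(x, y), 4))  # → 0.7746
```

The same value is returned by `np.corrcoef(x, y)[0, 1]`, which can serve as a cross-check.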

Note that

- The existence of a statistical correlation does not mean that there is a cause-and-effect relation between the variables. Cause and effect means that a change in one variable causes a change in the other variable.
- The changes in the variables may be due to a common cause or random variations.
- There are many kinds of correlation coefficients. The choice of which to use for a particular set of data depends on factors such as
  - the type of scale (measurement scale) used to express the variables,
  - the nature of the underlying distribution (continuous or discrete),
  - the characteristics of the distribution of the scores (linear or non-linear).

- The correlation is perfectly linear if a constant change in *X* is accompanied by a constant change in *Y*. In this case all the points in the scatter diagram lie on a straight line.
- A high correlation coefficient does not necessarily imply a direct dependence between the variables. For example, there may be a high correlation between the number of crimes and shoe prices. Such correlations are referred to as non-sense or spurious correlations.
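The perfect-linear case can be checked numerically: if every point satisfies *Y = a + bX*, the coefficient comes out as exactly +1 or -1 depending on the sign of the slope (the values of *a* and *b* below are arbitrary):

```python
import numpy as np

# When Y changes by a constant amount for each unit change in X
# (i.e. Y = a + bX), all scatter points lie on a straight line.
x = np.arange(1.0, 11.0)
y_up = 3.0 + 2.0 * x      # positive slope -> r = +1
y_down = 3.0 - 2.0 * x    # negative slope -> r = -1

print(np.corrcoef(x, y_up)[0, 1])    # → 1.0 (up to floating-point error)
print(np.corrcoef(x, y_down)[0, 1])  # → -1.0 (up to floating-point error)
```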

**Properties of the Correlation Coefficient**

- The correlation coefficient is symmetrical with respect to *X* and *Y*, i.e., *r*_{XY} = *r*_{YX}.
- The correlation coefficient is a pure number and does not depend upon the units in which the variables are measured.
- The correlation coefficient is the geometric mean of the two regression coefficients. Thus if the two regression lines of *Y* on *X* and *X* on *Y* are written as *Y = a + bX* and *X = c + dY* respectively, then *bd = r*^{2}.
- The correlation coefficient is independent of the choice of origin and scale of measurement of the variables, i.e., *r* remains unchanged if constants are added to or subtracted from the variables, and if the variables are multiplied or divided by the same positive constant.
- The correlation coefficient lies between -1 and +1, symbolically *-1 ≤ r ≤ 1*.
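Several of these properties (symmetry, invariance to origin and scale, and the bounds) can be verified numerically; this sketch uses simulated data with an arbitrary seed and arbitrary shift/scale constants:

```python
import numpy as np

rng = np.random.default_rng(0)        # arbitrary seed, for reproducibility
x = rng.normal(size=100)
y = 0.5 * x + rng.normal(size=100)    # Y linearly related to X plus noise

r_xy = np.corrcoef(x, y)[0, 1]

# Symmetry: r_XY equals r_YX
assert np.isclose(r_xy, np.corrcoef(y, x)[0, 1])

# Invariance: adding constants and multiplying by positive constants
# leaves r unchanged (here 10, 3, -2, 5 are arbitrary)
assert np.isclose(r_xy, np.corrcoef(10 + 3 * x, -2 + 5 * y)[0, 1])

# Bounds: -1 <= r <= 1
assert -1.0 <= r_xy <= 1.0
```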

**Download pdf:**
**Linear Correlation Coefficient 148.71 KB**