Simple linear regression analysis is based on a linear relation between the response variable Y and a single predictor variable X. The model investigated is Y = α + βX + ε, where ε is the error term, representing factors other than X that contribute to the value of Y. A linear regression line is used when the linear correlation coefficient calculated for the sample data is high enough and its significance is confirmed by a hypothesis test. The first step in the analysis is to find the equation of the line of best fit. There are many methods for finding this line, but the least-squares regression line is accepted as the reliable tool for prediction and forecasting.
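As a sketch of this preliminary step, the sample correlation coefficient r and the t statistic for testing its significance (H₀: ρ = 0, with n − 2 degrees of freedom) can be computed directly from the sums. The data values below are hypothetical, chosen only to illustrate the arithmetic.

```python
import math

# Hypothetical sample data, for illustration only.
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)

# Pearson correlation coefficient r from the summation formula.
num = n * sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y)
den = math.sqrt(
    (n * sum(xi**2 for xi in x) - sum(x) ** 2)
    * (n * sum(yi**2 for yi in y) - sum(y) ** 2)
)
r = num / den

# t statistic for testing H0: rho = 0; compare against the t distribution
# with n - 2 degrees of freedom.
t = r * math.sqrt((n - 2) / (1 - r**2))

print(r, t)
```

If the computed t exceeds the critical value for the chosen significance level, the correlation is judged significant and fitting a regression line is justified.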

The least-squares regression line is obtained by minimizing the sum of the squared deviations of all the data points from the fitted line.

Statistically, the line of best fit is written in the form Y' = a + bX, which is equivalent to the linear equation Y = mX + b, with b playing the role of the slope m and a the role of the intercept.

The values of a, the Y' intercept, and b, the slope, are found using formulas similar to the one used for finding the correlation coefficient r for sample data:

$a = \frac{(\sum y)(\sum x^{2})-(\sum x)(\sum xy)}{n(\sum x^{2})-(\sum x)^{2}}$

$b = \frac{n(\sum xy)-(\sum x)(\sum y)}{n(\sum x^{2})-(\sum x)^{2}}$

The slope b can also be defined as $b = \frac{S_{x,y}}{S_{x}^{2}}$, where $S_{x,y}$ is the sample covariance between x and y and $S_{x}^{2}$ is the sample variance of x.
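As a minimal sketch, the two summation formulas can be evaluated directly. The data points below are hypothetical and serve only to illustrate the arithmetic.

```python
# Least-squares intercept a and slope b from the summation formulas.
# Hypothetical sample data, for illustration only.
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]

n = len(x)
sum_x = sum(x)
sum_y = sum(y)
sum_xy = sum(xi * yi for xi, yi in zip(x, y))
sum_x2 = sum(xi**2 for xi in x)

# Both formulas share the same denominator n(sum x^2) - (sum x)^2.
denom = n * sum_x2 - sum_x**2
a = (sum_y * sum_x2 - sum_x * sum_xy) / denom  # Y' intercept
b = (n * sum_xy - sum_x * sum_y) / denom       # slope

print(a, b)  # regression line: Y' = a + bX
```

For these five points the line works out to Y' = 2.2 + 0.6X.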

The Y intercept a can also be defined as $a = \bar{y} - b\bar{x}$, where $\bar{x}$ and $\bar{y}$ are the sample means of the x group and the y group, respectively. The response variable is denoted by Y' to distinguish the estimate from the actual value Y. The regression line is used for estimation purposes after testing the significance of its slope and intercept.
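A sketch of the equivalent covariance form, again with hypothetical data: the slope is the sample covariance divided by the sample variance of x, and the intercept follows because the least-squares line always passes through the point of means $(\bar{x}, \bar{y})$.

```python
# Slope via b = S_xy / S_x^2 and intercept via a = y_bar - b * x_bar.
# Hypothetical sample data, for illustration only.
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n

# Sample covariance S_xy and sample variance S_x^2 (divisor n - 1).
s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / (n - 1)
s_x2 = sum((xi - x_bar) ** 2 for xi in x) / (n - 1)

b = s_xy / s_x2        # slope
a = y_bar - b * x_bar  # Y' intercept: the line passes through (x_bar, y_bar)

print(a, b)
```

The n − 1 divisor cancels in the ratio, so this gives exactly the same a and b as the summation formulas.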