Autocorrelation is the correlation of a time series with its own past and future values. It is a statistical method used for time series analysis and is commonly known as serial correlation or lagged correlation. For a random process, it describes the correlation between values of the process at different points in time. Box and Jenkins popularized its use in time series modelling in their 1976 book. Autocorrelation can be used to detect non-randomness in data and, when the data are not random, to identify an appropriate time series model. It also aids prediction, since future values depend on past ones; the standard tools for assessing autocorrelation are the time series plot, the lagged scatter plot, and the autocorrelation function. Mathematically, it is a tool for finding repeated patterns that are obscured by noise in the data.

Autocorrelation can be of two types:
 1) Positive
 2) Negative
Positive autocorrelation:
Consecutive errors usually have the same sign: positive residuals tend to be followed by positive residuals, and negative residuals by negative residuals.
Positive autocorrelation tends to make the estimate of the error variance too small. As a result, the null hypothesis is rejected with a higher probability than the stated significance level, and the confidence intervals are too narrow.

Negative autocorrelation:
Consecutive errors typically have opposite signs: negative residuals tend to be followed by positive residuals, and vice versa.
Negative autocorrelation tends to make the estimate of the error variance too large, so the confidence intervals become too wide. In this case the power of significance tests is reduced.

When either positive or negative autocorrelation is present, ordinary least squares parameter estimates are less efficient than generalized least squares parameter estimates.
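A common check for serial correlation in regression residuals is the Durbin-Watson statistic; a minimal sketch, using synthetic residual series purely for illustration:

```python
import numpy as np

def durbin_watson(resid):
    # Durbin-Watson statistic: values near 2 suggest no autocorrelation,
    # values well below 2 suggest positive autocorrelation,
    # values well above 2 suggest negative autocorrelation
    return (np.diff(resid) ** 2).sum() / (resid ** 2).sum()

t = np.linspace(0, 2 * np.pi, 200)
pos = np.sin(t)                 # slowly varying -> positively autocorrelated
neg = (-1.0) ** np.arange(200)  # sign-alternating -> negatively autocorrelated

print(durbin_watson(pos))  # well below 2
print(durbin_watson(neg))  # well above 2
```

The statistic is roughly 2(1 - r), where r is the lag-1 autocorrelation of the residuals, which is why values near 2 indicate no serial correlation.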
Properties of autocorrelation are listed below:
1) The autocorrelation function is bounded: -1 $\leq$ $\rho_{x}(l)$ $\leq$ 1. For white noise, x(n) $\sim$ WN($\mu_{x}$, $\sigma^{2}_{x}$), it reduces to $\rho_{x}(l)$ = $\delta(l)$.

2) Helps in assigning meaning to estimated values from signals.

3) For the autocovariance function $\gamma$ of a stationary time series {X$_{t}$} we have

$\gamma(0)$ $\geq$ 0
$|\gamma(h)|$ $\leq$ $\gamma(0)$
$\gamma(h)$ = $\gamma(-h)$
$\gamma$ is positive semi-definite

4) All autocorrelation functions are even functions.
5) A time shift of a signal does not change its autocorrelation.
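Properties 4) and 5) can be checked numerically. The sketch below uses the circular autocorrelation of a finite sequence, for which both properties hold exactly (the function name is illustrative):

```python
import numpy as np

def circ_autocorr(x):
    # circular autocorrelation: r[l] = sum_n x[n] * x[(n + l) mod N]
    return np.array([np.dot(x, np.roll(x, -l)) for l in range(len(x))])

rng = np.random.default_rng(0)
x = rng.standard_normal(16)
r = circ_autocorr(x)

# property 4: evenness, r[l] == r[-l] (indices taken mod N)
print(np.allclose(r[1:], r[1:][::-1]))                # True
# property 5: a time shift leaves the autocorrelation unchanged
print(np.allclose(circ_autocorr(np.roll(x, 5)), r))   # True
```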
Spatial Autocorrelation:
The correlation of a variable with itself through space is known as spatial autocorrelation. If there is a systematic pattern in the spatial distribution of a variable, it is said to be spatially autocorrelated; if neighbouring areas are alike, the spatial autocorrelation is positive. If the pattern is random, there is no spatial autocorrelation.
Spatial autocorrelation typically arises when the outcomes at two points are related to the distance between them. When analyzing spatial data, it is important to check for it.
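A standard measure of spatial autocorrelation is Moran's I, which is positive when neighbouring sites have similar values and near zero for a spatially random pattern. A minimal sketch for sites on a line with adjacency weights (the values are made up for illustration):

```python
import numpy as np

def morans_i(values, weights):
    # Moran's I = (N / sum of weights) * spatially weighted covariance / variance
    z = values - values.mean()
    num = (weights * np.outer(z, z)).sum()
    return len(values) / weights.sum() * num / (z ** 2).sum()

# six sites on a line; each site's neighbours are the adjacent sites
n = 6
w = np.zeros((n, n))
for i in range(n - 1):
    w[i, i + 1] = w[i + 1, i] = 1.0

clustered = np.array([1.0, 1.0, 1.0, 5.0, 5.0, 5.0])  # neighbours alike
print(morans_i(clustered, w))  # positive (0.6 for this example)
```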

Partial Autocorrelation:


The partial autocorrelation at lag k is the correlation between series values that are k intervals apart, after accounting for the values of the intervening intervals. It is a conditional correlation, obtained by correlating the residuals from two different regressions. It can be used to find the appropriate lags and plays an important role in identifying the order of an autoregressive model.
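The residual-regression definition can be made concrete for lag 2: regress x(t) on x(t-1), regress x(t-2) on x(t-1), and correlate the two residual series. A sketch on a simulated AR(1) series, for which the lag-2 partial autocorrelation should be near zero (helper names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# simulate an AR(1) process: x[t] = 0.8 * x[t-1] + noise
n = 2000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + e[t]

def resid(y, z):
    # residuals of a least-squares regression of y on z (with intercept)
    A = np.column_stack([np.ones_like(z), z])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return y - A @ beta

x0, x1, x2 = x[2:], x[1:-1], x[:-2]
pacf2 = np.corrcoef(resid(x0, x1), resid(x2, x1))[0, 1]
print(pacf2)  # near zero: the AR(1) model needs only one lag
```

This is exactly how the partial autocorrelation identifies the extent of the lag: for an AR(p) process the partial autocorrelations cut off after lag p.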
The autocorrelation function measures the correlation between observations at different times and is the normalized autocovariance function.
Given measurements Y$_{1}$, Y$_{2}$, ........., Y$_{N}$ at times X$_{1}$, X$_{2}$, ........., X$_{N}$, the lag-k autocorrelation function is defined as
 
r$_{k}$ = $\frac{\sum_{i = 1}^{N-k}(Y_{i}-\bar{Y})(Y_{i+k}-\bar{Y})}{\sum_{i = 1}^{N}(Y_{i}-\bar{Y})^{2}}$
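A direct implementation of this formula (a sketch; the function name is illustrative):

```python
import numpy as np

def acf(y, k):
    # lag-k sample autocorrelation r_k: deviations from the overall mean,
    # products of pairs k apart, normalized by the total sum of squares
    ybar = y.mean()
    den = ((y - ybar) ** 2).sum()
    if k == 0:
        return 1.0
    return ((y[:-k] - ybar) * (y[k:] - ybar)).sum() / den

y = np.array([1.0, 2.0, 3.0, 4.0])
print(acf(y, 0))  # 1.0 by definition
print(acf(y, 1))  # 0.25 for this small example
```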

The autocorrelation coefficients, arranged as a function of the separation in time, are known as the sample autocorrelation function; its plot is commonly called a correlogram. For a wide-sense stationary signal it is defined as:

$\rho_{x}(l)$ = $\frac{\gamma _{x}(l)}{\gamma _{x}(0)}$

= $\frac{\gamma _{x}(l)}{\sigma^{2} _{x}}$

where $\gamma _{x}(l)$ is the autocovariance of x(n):

$\gamma _{x}(l)$ = E$[(x(n+l)-\mu_{x})(x(n)-\mu_{x})]$

= r$_{x}(l)- |\mu_{x}|^{2}$
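The relation between the autocovariance and the raw correlation r$_{x}$(l) can be checked on simulated data; the estimates below are finite-sample, so agreement is only approximate:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000) + 3.0   # stationary noise with mean mu ~ 3
l, mu = 1, x.mean()

# autocovariance at lag l from the definition E[(x(n+l)-mu)(x(n)-mu)]
gamma = ((x[l:] - mu) * (x[:-l] - mu)).mean()
# raw correlation E[x(n+l) x(n)] minus |mu|^2
r_minus_musq = (x[l:] * x[:-l]).mean() - mu ** 2

print(gamma, r_minus_musq)  # both near zero and approximately equal
```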