# Autocorrelation

Autocorrelation is a mathematical tool used frequently in signal processing for analysing functions or series of values, such as time-domain signals. It is the cross-correlation of a signal with itself. Autocorrelation is useful for finding repeating patterns in a signal, such as detecting a periodic signal buried under noise, or identifying the fundamental frequency of a signal that does not actually contain that frequency component but implies it through its many harmonics.
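The periodicity-detection idea can be sketched numerically. The following is an illustrative example (the 5 Hz tone, sample rate, and noise level are all assumptions, not from the text): a sine wave is buried in noise, and the lag of the autocorrelation peak recovers its period.

```python
import numpy as np

# Illustrative sketch: recover the period of a 5 Hz sine buried in noise
# by locating the first strong autocorrelation peak.
fs = 500                                   # sample rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 5 * t) + rng.standard_normal(t.size)

x = x - x.mean()                           # zero-centre before correlating
r = np.correlate(x, x, mode="full")[t.size - 1 :]   # lags 0 .. N-1

# Search a window of plausible lags for the peak; the true period is
# fs / 5 = 100 samples, so the estimate should land near 5 Hz.
lag = np.argmax(r[50:150]) + 50
f0 = fs / lag
```

Even at this signal-to-noise ratio the peak near lag 100 stands well clear of the noise floor, which is precisely why autocorrelation is a robust period detector.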

## Definitions

Different definitions of autocorrelation are in use depending on the field of study, and not all of them are equivalent. In some fields, the term is used interchangeably with autocovariance.

### Statistics

In statistics, the autocorrelation of a discrete time series or process $X_t$ is simply the correlation of the process against a time-shifted version of itself. If $X_t$ is second-order stationary with mean $\mu$ and variance $\sigma^2$, this definition is

$\displaystyle R(k) = \frac{E[(X_i - \mu)(X_{i+k} - \mu)]}{\sigma^2}$

where $E$ is the expected value and $k$ is the time shift being considered (usually referred to as the lag). This function has the attractive property of lying in the range $[-1, 1]$, with 1 indicating perfect correlation (the signals exactly overlap when time-shifted by $k$) and $-1$ indicating perfect anti-correlation. It is common practice in many disciplines to drop the normalisation by $\sigma^2$ and use the term autocorrelation interchangeably with autocovariance.
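The statistical definition translates directly into code by replacing $E$, $\mu$ and $\sigma^2$ with their sample counterparts. A minimal sketch (the helper name `autocorr` is ours, not standard):

```python
import numpy as np

def autocorr(x, k):
    """Sample autocorrelation of a 1-D series at lag k, normalised to [-1, 1]."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    var = x.var()        # sigma^2; drop this factor to get the autocovariance
    n = x.size
    # Average the products of deviations, k steps apart.
    return np.sum((x[: n - k] - mu) * (x[k:] - mu)) / (n * var)

# At lag 0 the series overlaps itself perfectly, so R(0) = 1.
autocorr([2.0, 4.0, 6.0, 8.0], 0)   # 1.0 by construction
```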

### Signal processing

In signal processing, given a signal $f(t)$, the continuous autocorrelation $R_f(\tau)$ is the continuous cross-correlation of $f(t)$ with itself at lag $\tau$, and is defined as:

$\displaystyle R_f(\tau) = f^*(-\tau) \circ f(\tau) = \int_{-\infty}^{\infty} f(t+\tau)f^*(t)\, dt = \int_{-\infty}^{\infty} f(t)f^*(t-\tau)\, dt$

where $f^*$ represents the complex conjugate and $\circ$ represents convolution. For a real function, $f^* = f$.

Formally, the discrete autocorrelation $R$ at lag $j$ for a signal $x_n$ is

$\displaystyle R(j) = \sum_n (x_n-m)(x_{n-j}-m )\,$

where $m$ is the average value (expected value) of $x_n$. Quite frequently, autocorrelations are calculated for zero-centered signals, that is, for signals with zero mean. The autocorrelation definition then becomes

$\displaystyle R(j) = \sum_n x_n x_{n-j}.\,$
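For a finite signal, only the overlapping terms of this sum exist. The sketch below (helper name `R` is ours) computes the zero-mean definition directly and checks it against `np.correlate`, which evaluates the same sum for every lag at once:

```python
import numpy as np

def R(x, j):
    """Discrete autocorrelation R(j) = sum_n x_n * x_{n-j} for a finite signal."""
    x = np.asarray(x, dtype=float)
    # Only the overlapping terms contribute at lag j >= 0.
    return np.sum(x[j:] * x[: x.size - j])

x = np.array([1.0, -2.0, 3.0, -1.0])
full = np.correlate(x, x, mode="full")   # all lags -(N-1) .. N-1
r_nonneg = full[x.size - 1 :]            # lags 0 .. N-1

# The direct sum and np.correlate agree at every non-negative lag:
all(np.isclose(R(x, j), r_nonneg[j]) for j in range(x.size))   # True
```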

Multi-dimensional autocorrelation is defined similarly. For example, in three dimensions the autocorrelation would be defined as

$\displaystyle R(j,k,\ell) = \sum_{n,q,r} (x_{n,q,r}-m)(x_{n-j,q-k,r-\ell}-m).$
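A sketch of the multi-dimensional sum in two dimensions, restricted to non-negative lags for brevity (the helper name `R2` is ours); the three-dimensional case simply adds one more axis:

```python
import numpy as np

def R2(x, j, k):
    """Two-dimensional autocorrelation at non-negative lags (j, k)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                                   # remove the mean m
    shifted = x[j:, k:]                                # x_{n, q}
    base = x[: x.shape[0] - j, : x.shape[1] - k]       # x_{n-j, q-k}
    return float(np.sum(shifted * base))

img = np.array([[1.0, 2.0],
                [3.0, 4.0]])
R2(img, 0, 0)   # sum of squared deviations from the mean: 5.0
```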

## Properties

In the following, we will describe properties of one-dimensional autocorrelations only, since most properties are easily transferred from the one-dimensional case to the multi-dimensional case.

• A fundamental property of the autocorrelation is symmetry, $R(i) = R(-i)$, which is easy to prove from the definition. In the continuous case, the autocorrelation is an even function
$\displaystyle R_f(-\tau) = R_f(\tau)\,$
when $f$ is a real function, and a Hermitian function
$\displaystyle R_f(-\tau) = R_f^*(\tau)\,$
when $f$ is a complex function.
• The continuous autocorrelation function reaches its peak at the origin, where it takes a real value, i.e. for any delay τ, $\displaystyle |R_f(\tau)| \leq R_f(0)$ . This is a consequence of the Cauchy-Schwarz inequality. The same result holds in the discrete case.
• The autocorrelation of a periodic function is, itself, periodic with the very same period.
• The autocorrelation of the sum of two completely uncorrelated functions (the cross-correlation is zero for all τ) is the sum of the autocorrelations of each function separately.
• Since autocorrelation is a specific type of cross-correlation, it maintains all the properties of cross-correlation.
• The autocorrelation of a white noise signal will have a strong peak at $\tau = 0$ and will be close to 0 for all other $\tau$. This shows that a sample of a white noise signal is statistically uncorrelated with a sample of the same signal taken at another time.
• According to the Wiener–Khinchin theorem, the autocorrelation function $R(\tau)$ of a wide-sense-stationary process and its power spectral density $S(f)$ form a Fourier-transform pair:
$\displaystyle R(\tau) = \int_{-\infty}^\infty S(f) e^{j 2 \pi f \tau} \, df$
$\displaystyle S(f) = \int_{-\infty}^\infty R(\tau) e^{- j 2 \pi f \tau} \, d\tau.$
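Several of these properties can be checked numerically. The sketch below verifies symmetry, the peak at the origin, and the Fourier relation between $R(\tau)$ and $S(f)$ in its circular (DFT) form, using an arbitrary random real signal:

```python
import numpy as np

# A quick numerical check (a sketch, not a proof) of three properties above.
rng = np.random.default_rng(1)
x = rng.standard_normal(256)
n = x.size

r = np.correlate(x, x, mode="full")                   # lags -(n-1) .. n-1

symmetric = np.allclose(r, r[::-1])                   # R(-j) = R(j) for real x
peak_at_zero = np.all(np.abs(r) <= r[n - 1] + 1e-9)   # |R(j)| <= R(0)

# Circular form of the Fourier pair: the inverse FFT of the power spectrum
# |X(f)|^2 equals the circular autocorrelation of x.
power = np.abs(np.fft.fft(x)) ** 2
r_circ = np.fft.ifft(power).real
r_direct = np.array([np.sum(x * np.roll(x, -j)) for j in range(n)])
wiener_khinchin = np.allclose(r_circ, r_direct)
```

The FFT route is also the standard fast way to compute autocorrelations in practice: $O(n \log n)$ instead of the $O(n^2)$ direct sum.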

## Applications

• In optics, normalized autocorrelations and cross-correlations give the degree of coherence of an electromagnetic field.