# Convolution theorem

In mathematics, the convolution theorem states that, under suitable conditions, the Fourier transform of a convolution of two functions is the point-wise product of their Fourier transforms. In other words, convolution in one domain (e.g. the time domain) equals point-wise multiplication in the other domain (e.g. the frequency domain). Versions of the convolution theorem hold for various Fourier-related transforms.

Let f and g be two functions with convolution f * g. (Note that the asterisk denotes convolution in this context, and not multiplication.) Let ${\mathcal {F}}$ denote the Fourier transform operator, so ${\mathcal {F}}[f]$ and ${\mathcal {F}}[g]$ are the Fourier transforms of f and g, respectively. Then

${\mathcal {F}}[f*g]={\sqrt {2\pi }}\,({\mathcal {F}}[f])\cdot ({\mathcal {F}}[g])$

where · denotes point-wise multiplication. (The factor ${\sqrt {2\pi }}$ corresponds to the unitary, angular-frequency convention for the Fourier transform; under other normalizations the constant is distributed differently.) The theorem also works "the other way round":

${\mathcal {F}}[f\cdot g]={\frac {{\mathcal {F}}[f]*{\mathcal {F}}[g]}{\sqrt {2\pi }}}$

By applying the inverse Fourier transform ${\mathcal {F}}^{-1}$, we can write:

$f*g={\sqrt {2\pi }}\,{\mathcal {F}}^{-1}[{\mathcal {F}}[f]\cdot {\mathcal {F}}[g]]$

This theorem also holds for the Laplace transform and the two-sided Laplace transform and, when suitably modified, for the Mellin transform and Hartley transform (see Mellin inversion theorem). It can be extended to the Fourier transform of abstract harmonic analysis defined over locally compact abelian groups.
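The discrete analogue of the theorem can be checked numerically. For the discrete Fourier transform in its common convention (as implemented by NumPy, with no ${\sqrt {2\pi }}$ factors), the statement reads DFT(f ⊛ g) = DFT(f) · DFT(g), where ⊛ is *circular* convolution of two length-n sequences. A minimal sketch:

```python
# Numerical check of the discrete convolution theorem:
# the DFT of a circular convolution equals the point-wise
# product of the DFTs (no sqrt(2*pi) factor in this convention).
import numpy as np

rng = np.random.default_rng(0)
n = 8
f = rng.standard_normal(n)
g = rng.standard_normal(n)

# Circular convolution computed directly from the definition.
circ = np.array([sum(f[k] * g[(m - k) % n] for k in range(n))
                 for m in range(n)])

# The same result via the frequency domain:
# multiply the DFTs, then transform back.
via_fft = np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)).real

assert np.allclose(circ, via_fft)
```

The agreement holds to floating-point precision for any pair of input sequences, since the identity is exact for the DFT.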

This formulation is especially useful for implementing numerical convolution on a computer: the standard (direct) convolution algorithm has quadratic computational complexity, O(n²), for inputs of length n. With the help of the convolution theorem and the fast Fourier transform, the complexity of the convolution can be reduced to O(n log n). This can be exploited to construct fast multiplication algorithms.
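As a sketch of the O(n log n) approach: zero-pad both sequences to length at least len(f) + len(g) − 1, so that the circular convolution computed by the DFT agrees with ordinary linear convolution, then multiply in the frequency domain and transform back. The helper name `fft_convolve` below is illustrative, not from the original text:

```python
# FFT-based linear convolution via the convolution theorem.
# Zero-padding prevents circular wrap-around, so the result
# matches direct (linear) convolution. Overall cost: O(n log n).
import numpy as np

def fft_convolve(f, g):
    n = len(f) + len(g) - 1            # length of the linear convolution
    size = 1 << (n - 1).bit_length()   # next power of two, for FFT efficiency
    F = np.fft.rfft(f, size)           # real-input FFT of zero-padded f
    G = np.fft.rfft(g, size)
    return np.fft.irfft(F * G, size)[:n]

f = [1.0, 2.0, 3.0]
g = [0.0, 1.0, 0.5]
result = fft_convolve(f, g)            # matches np.convolve(f, g)
```

Multiplying two large integers digit by digit is exactly such a convolution of their digit sequences (followed by carry propagation), which is the idea behind FFT-based fast multiplication.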