# Differential operator


In mathematics, a differential operator is a linear operator defined as a function of the differentiation operator. It is helpful, as a matter of notation first, to consider differentiation as an abstract operation, accepting a function and returning another (in the style of a higher-order function in computer science).

## Notations

The most commonly used differential operator is the action of taking the derivative itself. Common notations for this operator include:

- ${d \over dx}$
- $D$, where the variable one is differentiating with respect to is clear, and
- $D_{x}$, where the variable is declared explicitly.

First derivatives are signified as above, but for higher, n-th derivatives, the following notations are useful:

- ${d^{n} \over dx^{n}}$
- $D^{n}$
- $D_{x}^{n}$

The D notation is credited to Oliver Heaviside, who considered differential operators of the form

$\sum _{k=0}^{n}c_{k}D^{k}$

in his study of differential equations.
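Applying an operator of this form can be sketched with sympy; the helper name `apply_operator` and the coefficients below are illustrative choices, not a standard API:

```python
import sympy as sp

x = sp.symbols('x')

def apply_operator(coeffs, f, var):
    """Apply sum_k c_k D^k to f, where coeffs[k] is the coefficient c_k."""
    return sum(c * sp.diff(f, var, k) for k, c in enumerate(coeffs))

# Example: (D^2 + 3D + 2) applied to exp(-x)
# D^2 e^{-x} = e^{-x}, 3D e^{-x} = -3 e^{-x}, so the terms cancel
result = apply_operator([2, 3, 1], sp.exp(-x), x)
print(sp.simplify(result))  # prints 0
```

This reflects the fact that $e^{-x}$ is annihilated by $D^{2}+3D+2=(D+1)(D+2)$.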

One of the most frequently seen differential operators is the Laplacian operator, defined by

$\Delta =\nabla ^{2}=\sum _{k=1}^{n}{\partial ^{2} \over \partial x_{k}^{2}}.$

Another differential operator is the Θ operator, defined by

$\Theta =z{d \over dz}.$

Its eigenfunctions are the monomials $z^{k}$, since $\Theta z^{k}=kz^{k}$.

## Adjoint of an operator

Given a linear differential operator

$Tu=\sum _{k=0}^{n}a_{k}(x)D^{k}u$

the adjoint of this operator is defined as the operator $T^{*}$ such that

$\langle u,Tv\rangle =\langle T^{*}u,v\rangle$

where the notation $\langle \cdot ,\cdot \rangle$ is used for the scalar product or inner product. This definition therefore depends on the definition of the scalar product. In the function space of square-integrable functions on an interval $[a,b]$, the scalar product is defined by

$\langle f,g\rangle =\int _{a}^{b}f^{*}(x)g(x)\,dx,$

where $f^{*}(x)$ denotes the complex conjugate of $f(x)$.

If one moreover adds the condition that f and g vanish for $x\to a$ and $x\to b$, one can also define the adjoint of T by

$T^{*}u=\sum _{k=0}^{n}(-1)^{k}D^{k}[a_{k}(x)u].$

This formula does not explicitly depend on the definition of the scalar product. It is therefore sometimes chosen as a definition of the adjoint operator. When $T^{*}$ is defined according to this formula, it is called the formal adjoint of T.
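The two characterizations agree under the vanishing boundary conditions. A sympy sketch for the first-order operator $Tu=a(x)u'$, with the illustrative choices $a=x$ and boundary-vanishing polynomials on $[0,1]$ (all names here are examples, not a standard API):

```python
import sympy as sp

x = sp.symbols('x')

# First-order operator T u = a(x) u' with a(x) = x, on [0, 1]
a = x
u = x * (1 - x)         # vanishes at both endpoints
v = x * (1 - x)**2      # vanishes at both endpoints

Tv = a * sp.diff(v, x)
# Formal adjoint from the formula: T* u = (-1)^1 D[a(x) u]
Tstar_u = -sp.diff(a * u, x)

lhs = sp.integrate(u * Tv, (x, 0, 1))        # <u, Tv>
rhs = sp.integrate(Tstar_u * v, (x, 0, 1))   # <T*u, v>
print(lhs, rhs)  # prints -1/60 -1/60
```

The equality of the two integrals is exactly the boundary-term-free integration by parts that the vanishing conditions guarantee.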

A self-adjoint operator is an operator equal to its own adjoint.

The Sturm–Liouville operator is a well-known example of a formally self-adjoint operator. This second-order linear differential operator L can be written in the form

$Lu=-(pu')'+qu=-(pu''+p'u')+qu=-pu''-p'u'+qu=(-p)D^{2}u+(-p')Du+(q)u$

This property can be proven using the formal adjoint definition above.

${\begin{matrix}L^{*}u&=&(-1)^{2}D^{2}[(-p)u]+(-1)^{1}D[(-p')u]+(-1)^{0}(qu)\\&=&D^{2}(-pu)-D(-p'u)+qu\\&=&-D^{2}(pu)+D(p'u)+qu\\&=&-(pu)''+(p'u)'+qu\\&=&-(p'u+pu')'+(p''u+p'u')+qu\\&=&-(p'u)'-(pu')'+p''u+p'u'+qu\\&=&-(p''u+p'u')-(p'u'+pu'')+p''u+p'u'+qu\\&=&-p''u-p'u'-p'u'-pu''+p''u+p'u'+qu\\&=&-p''u+p''u-p'u'-p'u'+p'u'-pu''+qu\\&=&-p'u'-pu''+qu\\&=&-(pu')'+qu&=&Lu\\\end{matrix}}$

This operator is central to Sturm–Liouville theory, where the eigenfunctions (analogues of eigenvectors) of this operator are considered.
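The same cancellation can be checked symbolically with sympy, using generic functions p, q, u (an illustrative sketch, not a library routine):

```python
import sympy as sp

x = sp.symbols('x')
p, q, u = (sp.Function(n)(x) for n in ('p', 'q', 'u'))

# L u = -(p u')' + q u, i.e. (-p) D^2 u + (-p') D u + q u
Lu = -sp.diff(p * sp.diff(u, x), x) + q * u

# Formal adjoint term by term: L* u = D^2[(-p) u] - D[(-p') u] + q u
Lstar_u = sp.diff(-p * u, x, 2) - sp.diff(-sp.diff(p, x) * u, x) + q * u

print(sp.simplify(Lu - Lstar_u))  # prints 0, confirming L* = L
```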

## Properties of differential operators

Differentiation is linear, i.e.,

$D(f+g)=(Df)+(Dg)$
$D(af)=a(Df)$

where f and g are functions, and a is a constant.
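Both identities can be confirmed in sympy on concrete functions (sin and exp are arbitrary illustrative choices):

```python
import sympy as sp

x, a = sp.symbols('x a')
f = sp.sin(x)
g = sp.exp(x)

# D(f + g) = Df + Dg
assert sp.diff(f + g, x) == sp.diff(f, x) + sp.diff(g, x)
# D(a f) = a Df for a constant a
assert sp.diff(a * f, x) == a * sp.diff(f, x)
```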

Any polynomial in D with function coefficients is also a differential operator. We may also compose differential operators by the rule

$(D_{1}\circ D_{2})(f)=D_{1}[D_{2}(f)].$

Some care is then required: firstly any function coefficients in the operator D2 must be differentiable as many times as the application of D1 requires. To get a ring of such operators we must assume derivatives of all orders of the coefficients used. Secondly, this ring will not be commutative: an operator gD isn't the same in general as Dg. In fact we have for example the relation basic in quantum mechanics:

$Dx-xD=1.$

The subring of operators that are polynomials in D with constant coefficients is, by contrast, commutative. It can be characterised another way: it consists of the translation-invariant operators.
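The commutation relation above can be verified symbolically: applying $Dx$ means differentiating after multiplying by x, while $xD$ multiplies after differentiating. A sympy sketch with a generic function f:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')(x)

# (D o x)(f) = D[x f]   versus   (x o D)(f) = x Df
Dx_f = sp.diff(x * f, x)   # = f + x f'  by the product rule
xD_f = x * sp.diff(f, x)   # = x f'

print(sp.simplify(Dx_f - xD_f))  # prints f(x), i.e. (Dx - xD) acts as 1
```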

## Several variables

The same constructions can be carried out with partial derivatives, differentiation with respect to different variables giving rise to operators that commute (see symmetry of second derivatives).
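This commutativity is easy to check on any smooth function; the function below is an arbitrary illustrative choice:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x * y) * sp.sin(x + y)

# Mixed partial derivatives taken in both orders
fxy = sp.diff(f, x, y)   # d/dy (d/dx f)
fyx = sp.diff(f, y, x)   # d/dx (d/dy f)

print(sp.simplify(fxy - fyx))  # prints 0
```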

## Coordinate-independent description

In differential geometry and algebraic geometry it is often convenient to have a coordinate-independent description of differential operators between two vector bundles. Let E and F be two vector bundles over a manifold M. An operator is a mapping of sections $P:\Gamma (E)\to \Gamma (F)$ which maps the stalk of the sheaf of germs of $\Gamma (E)$ at a point $x\in M$ to the fibre of F at $x$:

$\Gamma _{x}(E)\to F_{x}.$

An operator P is said to be a k-th order differential operator if it factors through the jet bundle Jk(E). In other words, there exists a linear mapping of vector bundles

$i_{P}:J^{k}(E)\to F$

such that $P=i_{P}\circ j^{k}$ as in the following composition:

$P:\Gamma _{x}(E)\to J^{k}(E)_{x}\to F_{x}.$

A foundational result and characterization is the Peetre theorem.