ODE0.2

From Example Problems

Describe Picard Iteration.

Theorem. Let f(x,t) be continuous for

|t-t_{0}|\leq \alpha ,\qquad |x-x_{0}|\leq \beta

and satisfy a Lipschitz condition with constant L in this region. Let |f(x,t)|\leq M there and let \delta =\min \left\{\alpha ,\beta /M\right\}. Then the initial value problem

x(t)=x_{0}+\int _{t_{0}}^{t}f(x(s),s)\,ds

has a unique solution for |t-t_{0}|\leq \delta.
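To make the constants concrete, here is a hypothetical instance (not part of the statement): f(x,t)=x with x_{0}=1, t_{0}=0, on the rectangle \alpha =\beta =1. A minimal sketch of how \delta is computed under these assumptions:

```python
# Hypothetical concrete instance (not from the article): f(x,t) = x with
# x0 = 1, t0 = 0, on the rectangle |t| <= alpha = 1, |x - 1| <= beta = 1.
alpha, beta = 1.0, 1.0

# On that rectangle |x| <= 2, so M = 2 bounds |f|; f(x,t) = x is
# Lipschitz in x with constant L = 1.
M, L = 2.0, 1.0

delta = min(alpha, beta / M)    # the theorem's guaranteed interval half-width
print(delta)                    # 0.5: a solution is guaranteed for |t| <= 0.5
```

Note that \delta may be strictly smaller than \alpha: the solution is only guaranteed on the interval where the iterates cannot escape the rectangle.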

Proof. The uniqueness of the solution has already been established in ODE0.3. The existence of the solution is proved in three steps.

(i) Define the following sequence of iterates:

x^{0}(t)=x_{0}

x^{1}(t)=x_{0}+\int _{t_{0}}^{t}f(x^{0}(s),s)\,ds

...

x^{N+1}(t)=x_{0}+\int _{t_{0}}^{t}f(x^{N}(s),s)\,ds


The first approximation x^{0} is made by neglecting the integral. The next, x^{1}, is obtained by using x^{0} in the integrand as a correction term.
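The iteration above can be carried out exactly for an assumed concrete choice (not from the article): f(x,t)=x, x_{0}=1, t_{0}=0. Each iterate is then a polynomial, and integrating term-by-term produces the next iterate. A minimal sketch with exact rational coefficients:

```python
from fractions import Fraction

def picard_step(coeffs, x0=Fraction(1)):
    """One Picard iterate for the assumed example f(x,t) = x, t0 = 0:
    x_new(t) = x0 + integral_0^t x_old(s) ds, where x_old is given as
    polynomial coefficients [c0, c1, ...] (c_k is the t^k coefficient)."""
    # Integrating c_k * s^k from 0 to t gives c_k * t^(k+1) / (k+1).
    integral = [Fraction(0)] + [c / (k + 1) for k, c in enumerate(coeffs)]
    integral[0] = x0            # add the constant term x0
    return integral

coeffs = [Fraction(1)]          # x^0(t) = x0 = 1
for _ in range(5):
    coeffs = picard_step(coeffs)

print(coeffs)                   # coefficients 1/k!: Taylor partial sums of exp(t)
```

For this choice the iterates are exactly the partial sums of the exponential series, so the convergence proved below can be watched directly.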

For the iterates to be defined, it must be shown that each iterate stays within the domain of definition of f(x,t). It will be shown that the iterates are defined for |t-t_{0}|\leq \delta and satisfy the inequalities

|x^{N+1}(t)-x_{0}|\leq \beta \qquad (\text{for }|t-t_{0}|\leq \delta )

The first step of induction is easy:

|x^{0}(t)-x_{0}|=0\leq \beta \qquad (\text{for }|t-t_{0}|\leq \delta )

The induction hypothesis is to assume that |x^{r}(t)-x_{0}|\leq \beta for r=0,\ldots ,N and |t-t_{0}|\leq \delta. Then x^{N+1} is defined, and

|x^{N+1}(t)-x_{0}|\leq \left|\int _{t_{0}}^{t}|f(x^{N}(s),s)|\,ds\right|

\leq M|t-t_{0}|

\leq \beta \qquad (\text{for }|t-t_{0}|\leq \delta )

since M\delta \leq \beta by the definition of \delta.

This completes the induction.

(ii)

Next it is shown that the sequence of iterates is uniformly and absolutely convergent. From the general term of the iteration it is clear that

x^{N+1}(t)-x^{N}(t)=\int _{t_{0}}^{t}\left\{f(x^{N}(s),s)-f(x^{N-1}(s),s)\right\}ds

Take the norm of both sides and use the Lipschitz condition.

|x^{N+1}(t)-x^{N}(t)|\leq L\left|\int _{t_{0}}^{t}|x^{N}(s)-x^{N-1}(s)|\,ds\right|

This produces a sequence of inequalities.

From part (i), |x^{1}(t)-x^{0}(t)|\leq M|t-t_{0}|.

The induction hypothesis is

|x^{r}(t)-x^{r-1}(t)|\leq ML^{r-1}{\frac {|t-t_{0}|^{r}}{r!}}\qquad (r=1,\ldots ,N)

Now it follows that

|x^{N+1}(t)-x^{N}(t)|\leq {\frac {ML^{N}}{N!}}\left|\int _{t_{0}}^{t}|s-t_{0}|^{N}\,ds\right|

=ML^{N}{\frac {|t-t_{0}|^{N+1}}{(N+1)!}}

Now x^{N}(t) can be written as the telescoping partial sum

x^{0}(t)+\sum _{r=1}^{N}\left\{x^{r}(t)-x^{r-1}(t)\right\}=x^{N}(t)

But the series on the left is dominated term-by-term by the series

|x_{0}|+\sum _{r=1}^{N}ML^{r-1}{\frac {|t-t_{0}|^{r}}{r!}}

As N\to \infty, this series converges uniformly and absolutely to

|x_{0}|+{\frac {M}{L}}\left[\exp\{L|t-t_{0}|\}-1\right]

So by the comparison test, x^{N}(t) converges uniformly for |t-t_{0}|\leq \delta to a limit function x(t).
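The factorial bound and the dominating series can be checked numerically in the assumed example f(x,t)=x, x_{0}=1, t_{0}=0 (with M=2, L=1, \delta =1/2 as before), where x^{N+1}(t)-x^{N}(t)=t^{N+1}/(N+1)! exactly:

```python
import math

# Assumed illustration (not from the article): f(x,t) = x, x0 = 1, t0 = 0,
# with M = 2, L = 1, delta = 0.5 from the rectangle |t| <= 1, |x - 1| <= 1.
M, L, delta, x0 = 2.0, 1.0, 0.5, 1.0

# For this f the Picard iterates are Taylor partial sums of exp(t), so
# |x^{N+1}(t) - x^N(t)| = |t|^(N+1)/(N+1)! exactly; check it against the
# bound M * L^N * |t - t0|^(N+1) / (N+1)! at t = delta.
for N in range(10):
    diff = delta ** (N + 1) / math.factorial(N + 1)
    bound = M * L ** N * delta ** (N + 1) / math.factorial(N + 1)
    assert diff <= bound

# The dominating series converges to |x0| + (M/L) * (exp(L*delta) - 1),
# and every partial sum stays below that limit.
limit = abs(x0) + (M / L) * (math.exp(L * delta) - 1)
partial = abs(x0) + sum(M * L ** (r - 1) * delta ** r / math.factorial(r)
                        for r in range(1, 20))
assert partial <= limit + 1e-12
print(round(limit, 6))
```

The point of the factorial in the denominator is visible here: the bound shrinks faster than any geometric series, which is what forces uniform convergence on the whole interval.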

(iii)

Again from the general term of the iteration, it can be seen that each iterate is a continuous function of t\, for |t-t_{0}|\leq \delta \,. Hence, since x^{N}(t)\, converges uniformly to x(t)\,, it follows that x(t)\, is a continuous function of t\, for |t-t_{0}|\leq \delta \,.

It follows from |x^{N+1}(t)-x_{0}|\leq \beta (for |t-t_{0}|\leq \delta ) that, as N\to \infty,

|x(t)-x_{0}|\leq \beta \qquad (\text{for }|t-t_{0}|\leq \delta )

It remains to show that x(t) satisfies the IVP x(t)=x_{0}+\int _{t_{0}}^{t}f(x(s),s)\,ds.

Let N\to \infty \, in the general iteration term.

x^{N+1}(t)\to x(t)

and

\int _{t_{0}}^{t}f(x^{N}(s),s)\,ds\to \int _{t_{0}}^{t}f(x(s),s)\,ds

Finally using the Lipschitz condition,

\left|\int _{t_{0}}^{t}\left\{f(x(s),s)-f(x^{N}(s),s)\right\}ds\right|\leq \left|\int _{t_{0}}^{t}L|x(s)-x^{N}(s)|\,ds\right|

\leq L\delta \max _{|t-t_{0}|\leq \delta }|x(t)-x^{N}(t)|

and the right-hand side tends to zero as N\to \infty, since x^{N}\to x uniformly. Hence x(t) satisfies the integral equation, completing the proof.
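This final Lipschitz estimate can also be observed numerically in the assumed example f(x,t)=x, x_{0}=1, t_{0}=0, \delta =1/2, where the exact solution is x(t)=e^{t} and x^{N}(t) is its degree-N Taylor partial sum:

```python
import math

# Assumed example again (not from the article): f(x,t) = x, x0 = 1, t0 = 0,
# delta = 0.5; exact solution x(t) = exp(t), iterate x^N(t) = partial sum.
delta, L = 0.5, 1.0

def x_N(t, N):
    """Degree-N Picard iterate for this example: Taylor partial sum of exp."""
    return sum(t ** k / math.factorial(k) for k in range(N + 1))

# The Lipschitz estimate bounds the integral error by
# L * delta * max |x - x^N| over |t| <= delta; watch it shrink with N.
grid = [-delta + k * delta / 50 for k in range(101)]
for N in [2, 5, 8]:
    err = max(abs(math.exp(t) - x_N(t, N)) for t in grid)
    print(N, L * delta * err)
```

The printed bound decays factorially in N, which is exactly why passing to the limit inside the integral is justified.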

