
Ordinary Differential Equations/Linear Systems

A system of differential equations is a collection of two or more differential equations, in which each equation may depend on the other unknown functions.

For example, consider the equations:

\begin{cases}x'(t)=2x(t)+y(t)\\ y'(t)=3y(t)\end{cases}

In this case the differential equation for x'(t) depends on both x(t) and y(t). In principle y'(t) could also depend on both x and y, but here it does not.

Notice that in some cases we can find a solution of a system of ODEs directly. In the example above, because y' does not depend on x, we can solve the second equation (by separating variables or using an integrating factor) to get y=C_2e^{3t}. Since a second constant will appear when we solve the first ODE, we call the constant here C_2. Substituting this into the first equation gives x'=2x+C_2e^{3t}, which we can solve using an integrating factor to get:

\begin{cases}x(t)=C_1e^{2t}+C_2e^{3t}\\y(t)=C_2e^{3t}\end{cases}
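To see the integrating factor step explicitly, write the first equation as x'-2x=C_2e^{3t} and multiply by e^{-2t}:

\begin{align}
e^{-2t}x'-2e^{-2t}x&=C_2e^{t}\\
\frac{d}{dt}\left(e^{-2t}x\right)&=C_2e^{t}\\
e^{-2t}x&=C_2e^{t}+C_1\\
x(t)&=C_1e^{2t}+C_2e^{3t}.
\end{align}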

In other cases a clever change of variables allows one to decouple the two ODEs. Consider the system

\begin{cases}x_1'(t)=4x_1(t)+2x_2(t)\\ x_2'(t)=2x_1(t)+4x_2(t)\end{cases}.

If we let y_1=x_1+x_2 and y_2=x_1-x_2, then we find that

\begin{cases}y_1'(t)=6y_1(t)\\ y_2'(t)=2y_2(t)\end{cases}
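Indeed, adding and subtracting the two original equations gives

\begin{align}
y_1'&=x_1'+x_2'=(4x_1+2x_2)+(2x_1+4x_2)=6(x_1+x_2)=6y_1\\
y_2'&=x_1'-x_2'=(4x_1+2x_2)-(2x_1+4x_2)=2(x_1-x_2)=2y_2.
\end{align}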

and each of these is easy to solve: y_1=C_1e^{6t} and y_2=C_2e^{2t}. Solving for x_1=\tfrac{1}{2}(y_1+y_2) and x_2=\tfrac{1}{2}(y_1-y_2), and absorbing the factor of \tfrac{1}{2} into the constants, we find x_1=C_1e^{6t}+C_2e^{2t} and x_2=C_1e^{6t}-C_2e^{2t}. It turns out to be helpful when working with systems to use vectors and matrices, so we introduce \textstyle\vec{x}(t)=\begin{pmatrix}x_1(t)\\x_2(t)\end{pmatrix}. Then the above system can be rewritten as:

\frac{d}{dt}\vec{x}(t)=\begin{pmatrix}4 & 2 \\ 2 & 4\end{pmatrix}\vec{x}(t).

And we have the solutions \vec{x}_1(t)=C_1e^{6t}\begin{pmatrix}1\\1\end{pmatrix} and \vec{x}_2(t)=C_2e^{2t}\begin{pmatrix}1\\-1\end{pmatrix}, whose sum is the solution found above.
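Notice that the constant vectors appearing here are eigenvectors of the coefficient matrix; a direct check shows

\begin{pmatrix}4 & 2\\ 2 & 4\end{pmatrix}\begin{pmatrix}1\\1\end{pmatrix}=\begin{pmatrix}6\\6\end{pmatrix}=6\begin{pmatrix}1\\1\end{pmatrix}, \qquad \begin{pmatrix}4 & 2\\ 2 & 4\end{pmatrix}\begin{pmatrix}1\\-1\end{pmatrix}=\begin{pmatrix}2\\-2\end{pmatrix}=2\begin{pmatrix}1\\-1\end{pmatrix}.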

Notice that the solutions we found were of the form e^{\lambda t}\vec{\xi} for some constant vector \vec{\xi}. Using this as motivation, we investigate the question: when does \vec{x}(t)=e^{\lambda t}\vec{\xi} solve the system

\frac{d}{dt}\vec{x}(t)=A\vec{x}(t).

where A is a given constant matrix?

By substituting into the equation we see that:

\begin{align}
\frac{d}{dt}(e^{\lambda t}\vec{\xi})&=A(e^{\lambda t}\vec{\xi})\\
\lambda e^{\lambda t}\vec{\xi}&=e^{\lambda t}A\vec{\xi}\\
e^{\lambda t}(A-\lambda I)\vec{\xi}&=\vec{0}.
\end{align}

Since e^{\lambda t}\neq 0, we must have (A-\lambda I)\vec{\xi}=\vec{0}. For this to hold with a nonzero vector \vec{\xi}, \lambda must be an eigenvalue of A and \vec{\xi} a corresponding eigenvector.
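For example, for the coefficient matrix of the previous system, the eigenvalues are found by solving the characteristic equation

\det(A-\lambda I)=\det\begin{pmatrix}4-\lambda & 2\\ 2 & 4-\lambda\end{pmatrix}=(4-\lambda)^2-4=(\lambda-6)(\lambda-2)=0,

which recovers the eigenvalues \lambda=6 and \lambda=2 used above.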

This is not quite the end of the story. When the matrix A is real, we shall consider the following cases:

Real Distinct Eigenvalues

Complex Eigenvalues

Real Repeated Eigenvalues