# Ordinary Differential Equations/One-dimensional first-order linear equations

## Definition

One-dimensional first-order inhomogeneous linear ODEs are ODEs of the form

$x'(t)+f(t)x(t)=g(t)$

for suitable (usually continuous) functions $f,g:\mathbb {R} \to \mathbb {R}$ ; note that when $g\equiv 0$ , we have a homogeneous equation instead.

## General solution

First we note the following superposition principle: if we have a solution $x_{h}$  ("$h$ " standing for "homogeneous") of the problem

$x_{h}'(t)+f(t)x_{h}(t)=0$

(which is nothing but the homogeneous problem associated with the above ODE) and a solution $x_{p}$  to the actual problem; that is, a function $x_{p}$  such that

$x_{p}'(t)+f(t)x_{p}(t)=g(t)$

("$p$ " standing for "particular solution", indicating that this is only one of the many possible solutions), then the function

$x(t):=ax_{h}(t)+x_{p}(t)$  ($a\in \mathbb {R}$  arbitrary)

still solves $x'(t)+f(t)x(t)=g(t)$ , just like the particular solution $x_{p}$  does. This is proved by computing the derivative of $x$  directly.
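This computation can also be checked numerically. The following sketch uses the illustrative choice $f(t)=1$ , $g(t)=t$ , for which $x_{h}(t)=e^{-t}$  solves the homogeneous equation and $x_{p}(t)=t-1$  the inhomogeneous one (both verifiable by hand), and confirms that $ax_{h}+x_{p}$  solves the equation for several $a$ :

```python
# Illustrative check of the superposition principle (a sketch, assuming the
# sample data f(t) = 1, g(t) = t; then x_h(t) = exp(-t) solves the homogeneous
# equation and x_p(t) = t - 1 the inhomogeneous one).
import math

def residual(x, t, h=1e-5):
    """x'(t) + f(t)x(t) - g(t) for f = 1, g = t, with x' by central difference."""
    return (x(t + h) - x(t - h)) / (2 * h) + x(t) - t

for a in (-2.0, 0.0, 3.5):
    x = lambda t, a=a: a * math.exp(-t) + (t - 1)  # a*x_h + x_p
    for t in (0.0, 1.0, 2.0):
        assert abs(residual(x, t)) < 1e-6  # solves the ODE up to finite-difference error
```

The residual stays at the order of the finite-difference error for every tested $a$ , as the superposition principle predicts.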

In order to obtain the solutions to the ODE under consideration, we first solve the related homogeneous problem; that is, we look for $x_{h}$  such that

$x_{h}'(t)+f(t)x_{h}(t)=0\Leftrightarrow x_{h}'=-f(t)x_{h}$ .

It may seem surprising, but this actually gives a very quick path to the general solution, which goes as follows. Separation of variables (using $\ln ^{-1}=\exp$ ) gives

$x_{h}(t)=\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)$ ,

since the function

$G(t):=-\int _{t_{0}}^{t}f(s)ds$

is an antiderivative of $t\mapsto -f(t)$ . Thus we have found a solution to the related homogeneous problem.
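As a sanity check (a sketch, assuming the sample choice $f(t)=2t$ , $t_{0}=0$ , which gives $x_{h}(t)=\exp(-t^{2})$ ), one can verify numerically that $x_{h}'+f(t)x_{h}=0$ :

```python
# Sketch: verify x_h' + f*x_h = 0 numerically for the sample choice
# f(t) = 2t, t0 = 0, where exp(-∫_0^t 2s ds) = exp(-t^2).
import math

def x_h(t):
    return math.exp(-t ** 2)

h = 1e-5
for t in (-1.0, 0.0, 0.7, 2.0):
    deriv = (x_h(t + h) - x_h(t - h)) / (2 * h)  # central difference for x_h'
    assert abs(deriv + 2 * t * x_h(t)) < 1e-6    # x_h' + f(t) x_h = 0
```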

For the determination of a solution $x_{p}$  to the actual equation, we now use an Ansatz: Namely we assume

$x_{p}(t)=c(t)x_{h}(t)$ ,

where $c:\mathbb {R} \to \mathbb {R}$  is a function. This Ansatz is called variation of the constant and is due to Leonhard Euler. Let's see what condition on $c$  we need for $x_{p}$  to be a solution. We want

$x_{p}'(t)+f(t)x_{p}(t)=g(t)$ , that is (by the product rule, inserting $x_{h}$  and using $x_{h}'=-f(t)x_{h}$ ):
$x_{p}'(t)+f(t)x_{p}(t)=c'(t)\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)+c(t)(-1)\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)f(t)+f(t)c(t)\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)=c'(t)\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)=g(t)$ .

Moving the exponential to the other side, this becomes

$c'(t)=g(t)\exp \left(\int _{t_{0}}^{t}f(s)ds\right)$

or, after integrating,

$c(t)=\int _{t_{0}}^{t}g(r)\exp \left(\int _{t_{0}}^{r}f(s)ds\right)dr+C_{1}$ .

Since all the manipulations we did are reversible, all functions of the form

$C_{2}\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)+\left(\int _{t_{0}}^{t}g(r)\exp \left(\int _{t_{0}}^{r}f(s)ds\right)dr+C_{1}\right)\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)$  ($C_{1},C_{2}\in \mathbb {R}$  arbitrary)

are solutions. If we set $C:=C_{2}+C_{1}$ , we get the general solution form

$C\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)+\left(\int _{t_{0}}^{t}g(r)\exp \left(\int _{t_{0}}^{r}f(s)ds\right)dr\right)\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)$ .

We now want to prove that these constitute all the solutions to the equation under consideration. Thus, set

$x_{C}(t):=C\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)+\left(\int _{t_{0}}^{t}g(r)\exp \left(\int _{t_{0}}^{r}f(s)ds\right)dr\right)\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)$

and let $x_{2}(t)$  be any other solution to the inhomogeneous problem under consideration. Then $x_{C}-x_{2}$  solves the homogeneous problem, for

$x_{C}'(t)-x_{2}'(t)+f(t)(x_{C}(t)-x_{2}(t))=x_{C}'(t)+f(t)x_{C}(t)-(x_{2}'(t)+f(t)x_{2}(t))=g(t)-g(t)=0$ .

Thus, if we prove that all the homogeneous solutions (and in particular the difference $x_{C}-x_{2}$ ) are of the form

$C\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)$ ,

then we may subtract

$D\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)$

from $x_{C}-x_{2}$  for an appropriate $D\in \mathbb {R}$  to obtain zero; hence $x_{2}=x_{C-D}$ , which is of the desired form.

Thus, let $x_{h}$  be any solution to the homogeneous problem. Consider the function

$t\mapsto x_{h}(t)\cdot \exp \left(\int _{t_{0}}^{t}f(s)ds\right)$ .

We differentiate this function and obtain by the product rule

$x_{h}'(t)\exp \left(\int _{t_{0}}^{t}f(s)ds\right)+f(t)\exp \left(\int _{t_{0}}^{t}f(s)ds\right)x_{h}(t)=-f(t)x_{h}(t)\exp \left(\int _{t_{0}}^{t}f(s)ds\right)+f(t)\exp \left(\int _{t_{0}}^{t}f(s)ds\right)x_{h}(t)=0$

since $x_{h}$  is a solution to the homogeneous problem. Hence, the function is constant (that is, equal to a constant $C\in \mathbb {R}$ ), and solving

$x_{h}(t)\cdot \exp \left(\int _{t_{0}}^{t}f(s)ds\right)=C$

for $x_{h}$  gives the claim.

We have thus arrived at:

Theorem 3.1:

For continuous $f,g:\mathbb {R} \to \mathbb {R}$ , the solutions to the ODE

$x'(t)+f(t)x(t)=g(t)$

are precisely the functions

$x(t)=C\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)+\left(\int _{t_{0}}^{t}g(r)\exp \left(\int _{t_{0}}^{r}f(s)ds\right)dr\right)\exp \left(-\int _{t_{0}}^{t}f(s)ds\right)$  ($C\in \mathbb {R}$  arbitrary).

Note that imposing a condition $x(t_{0})=x_{0}$  for some $x_{0}\in \mathbb {R}$  forces $C=x_{0}$  (both integrals vanish at $t=t_{0}$ ), whence we get a unique solution for each initial condition.
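The closed-form solution of theorem 3.1 can also be checked numerically. The following sketch implements the formula with a simple Simpson-rule integrator; the data $f\equiv 1$ , $g(t)=t$ , $t_{0}=0$  are illustrative choices, and for them the exact solution with $x(0)=C$  is $x(t)=t-1+(C+1)e^{-t}$ , as one verifies by hand:

```python
# Sketch: evaluate the general solution formula of theorem 3.1 numerically
# for the illustrative data f(t) = 1, g(t) = t, t0 = 0, C = x(0) = 2.
import math

def integrate(h, a, b, n=200):
    """Composite Simpson rule for the integral of h over [a, b] (n even)."""
    if a == b:
        return 0.0
    step = (b - a) / n
    total = h(a) + h(b)
    for k in range(1, n):
        total += (4 if k % 2 else 2) * h(a + k * step)
    return total * step / 3

f = lambda s: 1.0   # coefficient function
g = lambda r: r     # right-hand side
t0, C = 0.0, 2.0    # base point and free constant (= x(t0))

def x(t):
    """The formula from theorem 3.1."""
    E = math.exp(-integrate(f, t0, t))
    part = integrate(lambda r: g(r) * math.exp(integrate(f, t0, r)), t0, t)
    return C * E + part * E

# Exact solution for this data: x(t) = t - 1 + (C + 1) * exp(-t).
for t in (0.0, 0.5, 1.0, 2.0):
    assert abs(x(t) - (t - 1 + (C + 1) * math.exp(-t))) < 1e-6
```

Note also that $x(t_{0})=C$  comes out of the formula directly, in line with the remark on initial conditions above.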

### Exercises

• Exercise 3.2.1: First prove that ${\frac {d}{dt}}\ln(t^{2})={\frac {2}{t}}$ . Then solve the ODE $x'(t)+{\frac {2}{t}}x(t)={\frac {1}{t^{2}}}$  for a function defined on $[1,\infty )$  such that $x(1)=c$  for arbitrary $c\in \mathbb {R}$ . Use the fact that an analogous version of theorem 3.1 holds when $f,g$  are only defined on a subinterval of $\mathbb {R}$ , since the proof carries over.

## Clever Ansatz for polynomial RHS

First note that RHS means "right-hand side". Let's consider the special case of a one-dimensional first-order linear ODE

$x'(t)+cx(t)=a_{i}t^{i}$  ($c\in \mathbb {R}$  arbitrary),

where we used Einstein summation convention; that is, $a_{i}t^{i}$  stands for $\sum _{i=0}^{m}a_{i}t^{i}$  for some $m\in \mathbb {N}$ . In the notation above, we have $f(t)\equiv c$  and $g(t)=a_{i}t^{i}$ .

Using separation of variables, the solutions to the corresponding homogeneous problem ($g\equiv 0$ ) are easily seen to equal $x_{h}(t)=C\exp(-ct)$  for a constant $C\in \mathbb {R}$ .

To find a particular solution $x_{p}$ , we proceed as follows. We make the Ansatz that $x_{p}$  is simply a polynomial; that is,

$x_{p}(t)=b_{i}t^{i}$

for certain coefficients $b_{i}$ ; when $c\neq 0$ , a polynomial of the same degree $m$  as the RHS suffices, and the $b_{i}$  are found by comparing coefficients.
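Comparing coefficients of $t^{i}$  in $x_{p}'+cx_{p}=a_{i}t^{i}$  gives $(i+1)b_{i+1}+cb_{i}=a_{i}$ , which (for $c\neq 0$ ) can be solved from the top degree downward. A minimal sketch (the helper name `poly_particular` and the sample ODE $x'+3x=t$  are our own choices for illustration):

```python
# Sketch: coefficient matching for x' + c x = sum_i a_i t^i with c != 0.
# Comparing t^i-coefficients of x_p' + c x_p gives (i+1) b_{i+1} + c b_i = a_i,
# solved from the top degree downward.
def poly_particular(a, c):
    """a[i] is the coefficient of t^i in the RHS; returns b with x_p(t) = sum b[i] t^i."""
    m = len(a) - 1
    b = [0.0] * (m + 1)
    b[m] = a[m] / c                              # leading coefficient
    for i in range(m - 1, -1, -1):
        b[i] = (a[i] - (i + 1) * b[i + 1]) / c   # lower coefficients in turn
    return b

# Example: x' + 3x = t has the particular solution x_p(t) = t/3 - 1/9.
b = poly_particular([0.0, 1.0], 3.0)
assert abs(b[1] - 1 / 3) < 1e-12 and abs(b[0] + 1 / 9) < 1e-12
```

One checks directly that $x_{p}(t)=t/3-1/9$  satisfies $x_{p}'+3x_{p}=1/3+t-1/3=t$ .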

### Exercises

• Exercise 3.3.1: Find all solutions to the ODE $x'(t)+2x(t)=2t^{2}+4t+3$ . (Hint: What does theorem 3.1 say about the number of solutions to that problem with a given fixed initial condition?)

Example 3.2: