To obtain solutions to our first more difficult partial differential equations (for example, Poisson's equation, the heat equation and a more general version of the transport equation), we will now set up the theory of distributions. Distributions are functionals: they map a function to a real number.

## The dual space

For each topological vector space, we can define a dual space.

**Definition 3.1**:

Let $X$ be a topological vector space. The set

$$X^* := \{ T : X \to \mathbb{R} \mid T \text{ is linear and continuous} \}$$

is called the **dual space of $X$**.

## Distributions

### Definition: Distributions

Let $\mathcal{X}$ be a function space with a notion of convergence. A **distribution** is a mapping $T : \mathcal{X} \to \mathbb{R}$ with two properties:

- $T$ is linear
- $T$ is continuous; i. e. if $\varphi_n \to \varphi$ in the notion of convergence of the function space, then it must follow that $T(\varphi_n) \to T(\varphi)$ in the ordinary notion of convergence in the real numbers known from first-semester analysis (i. e. $\lim_{n \to \infty} T(\varphi_n) = T(\varphi)$)

If $\mathcal{X}$ is the space of the bump functions, we simply call $T$ a *distribution* (because usually distributions are understood to be distributions with the bump functions as function space). If however $\mathcal{X}$ is the space of the Schwartz functions, then we call $T$ a *tempered distribution*.

### Theorem 1.2

Let $(\varphi_n)_{n \in \mathbb{N}}$ be an arbitrary sequence of bump functions. If $\varphi_n \to \varphi$ with respect to the notion of convergence for bump functions, then also $\varphi_n \to \varphi$ with respect to the notion of convergence for Schwartz functions.

*Proof:*

Let $K$ be the compact set in which all the supports $\operatorname{supp} \varphi_n$ are contained. In $\mathbb{R}^d$, 'compact' is the same as 'bounded and closed'. Therefore, $K \subseteq \overline{B_R(0)}$ for some $R > 0$. Then we have for all multiindices $\alpha, \beta \in \mathbb{N}_0^d$ that

$$\sup_{x \in \mathbb{R}^d} \left| x^\alpha \partial_\beta (\varphi_n - \varphi)(x) \right| \le R^{|\alpha|} \sup_{x \in K} \left| \partial_\beta (\varphi_n - \varphi)(x) \right| \to 0, \quad n \to \infty,$$

due to the definition of convergence for bump functions. Therefore the sequence converges with respect to the notion of convergence for Schwartz functions.

### Examples

An example of a distribution is the Dirac delta distribution $\delta_{x_0}$ for an $x_0 \in \mathbb{R}^d$, which is defined by

$$\delta_{x_0}(\varphi) := \varphi(x_0)$$

for functions $\varphi$ in the respective function space.
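As a small numerical illustration (a sketch only; the names `delta` and `bump` are our own, not standard library API), the delta distribution simply evaluates its argument, and its linearity is immediate:

```python
import numpy as np

def delta(phi, x0=0.0):
    # Dirac delta distribution centred at x0: delta_{x0}(phi) = phi(x0)
    return phi(x0)

def bump(x):
    # the standard bump function: exp(-1/(1-x^2)) on (-1, 1), zero outside
    x = np.asarray(x, dtype=float)
    inside = np.abs(x) < 1
    # the inner where guards against division by zero outside (-1, 1)
    return np.where(inside, np.exp(-1.0 / np.where(inside, 1.0 - x**2, 1.0)), 0.0)

phi = bump
psi = lambda x: bump(x - 0.5)

# linearity: delta(2*phi + 3*psi) = 2*delta(phi) + 3*delta(psi)
lhs = delta(lambda x: 2 * phi(x) + 3 * psi(x))
rhs = 2 * delta(phi) + 3 * delta(psi)
```

Continuity of $\delta_{x_0}$ holds because convergence of bump (or Schwartz) functions implies in particular pointwise convergence at $x_0$.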

### Regular distributions

Let $f : \mathbb{R}^d \to \mathbb{R}$ be a function and $\mathcal{X} \subseteq L^\infty(\mathbb{R}^d)$ be a function space, where $L^\infty(\mathbb{R}^d)$ denotes the set of the essentially bounded functions (i. e. the functions which are bounded by a certain constant except on a Lebesgue null set). Then we can define a mapping $T_f : \mathcal{X} \to \mathbb{R}$ as follows:

$$T_f(\varphi) := \int_{\mathbb{R}^d} f(x) \varphi(x) \, dx.$$

We call a distribution $T$ a *regular distribution* if and only if there is a function $f$ such that $T = T_f$.
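A regular distribution can be approximated numerically by quadrature. The sketch below (the name `T` and the grid parameters are our own choices) pairs the locally integrable but discontinuous function $\operatorname{sign}$ with rapidly decaying test functions:

```python
import numpy as np

def T(f, phi, a=-10.0, b=10.0, n=100001):
    # T_f(phi) = ∫ f(x) phi(x) dx, approximated by a trapezoidal sum
    x = np.linspace(a, b, n)
    y = f(x) * phi(x)
    dx = x[1] - x[0]
    return (np.sum(y) - 0.5 * (y[0] + y[-1])) * dx

f = np.sign                         # locally integrable, not continuous
phi = lambda x: np.exp(-x**2)       # rapidly decaying test function
psi = lambda x: x * np.exp(-x**2)   # another test function

# sign(x)·exp(-x²) is odd, so T_f(phi) ≈ 0;
# sign(x)·x·exp(-x²) is even, and T_f(psi) = 2∫_0^∞ x e^{-x²} dx = 1
val_odd = T(f, phi)
val_even = T(f, psi)
```

The truncation of the integration domain to $[-10, 10]$ is harmless here because the test functions decay like $e^{-x^2}$.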

### Theorem 1.3

The following three claims are true:

- If $f \in L^1(\mathbb{R}^d)$ is an integrable function and $\mathcal{X} \hookrightarrow L^\infty(\mathbb{R}^d)$, where the embedding is sequentially continuous, then $T_f$ as defined above is a distribution.
- If $f$ is a locally integrable function, $O \subseteq \mathbb{R}^d$ is a domain and $\mathcal{X} = \mathcal{D}(O)$, then $T_f$ as defined above is a distribution.
- If $f \in L^2(\mathbb{R}^d)$ and $\mathcal{X} = \mathcal{S}(\mathbb{R}^d)$, then $T_f$ as defined above is a distribution.

*Proof*:

1) The linearity is due to the linearity of the integral. Well-definedness follows from the calculation

$$|T_f(\varphi)| \le \int_{\mathbb{R}^d} |f(x)| |\varphi(x)| \, dx \le \|f\|_{L^1} \|\varphi\|_\infty < \infty.$$

Since the embedding $\mathcal{X} \hookrightarrow L^\infty(\mathbb{R}^d)$ is sequentially continuous, we have

$$\varphi_n \to \varphi \text{ in } \mathcal{X} \implies \|\varphi_n - \varphi\|_\infty \to 0.$$

Therefore, continuity follows from

$$|T_f(\varphi_n) - T_f(\varphi)| \le \|f\|_{L^1} \|\varphi_n - \varphi\|_\infty \to 0, \quad n \to \infty.$$

2) The proof follows by observing that $f \in L^1(K)$ for every compact $K \subset O$, since $K$ is bounded, and that the notion of convergence in $\mathcal{D}(O)$ requires that if $\varphi_n \to \varphi$, then there exists a compact set $K \subset O$ such that $\operatorname{supp} \varphi_n \subseteq K$ for all $n \in \mathbb{N}$, and then performing almost the same calculations as above (with $\|f\|_{L^1(K)}$ in place of $\|f\|_{L^1}$).

3) Due to the triangle inequality for integrals and Hölder's inequality, we have

$$|T_f(\varphi)| \le \int_{\mathbb{R}^d} |f(x)| |\varphi(x)| \, dx \le \|f\|_{L^2} \|\varphi\|_{L^2}.$$

But we furthermore have

$$\|\varphi\|_{L^2}^2 = \int_{\mathbb{R}^d} \frac{\left( (1+|x|)^d |\varphi(x)| \right)^2}{(1+|x|)^{2d}} \, dx \le \sup_{x \in \mathbb{R}^d} \left( (1+|x|)^d |\varphi(x)| \right)^2 \int_{\mathbb{R}^d} \frac{dx}{(1+|x|)^{2d}},$$

where the last integral is finite and the supremum is bounded by a finite sum of Schwartz seminorms. If $\varphi_n \to \varphi$ in the notion of convergence of the Schwartz function space, then this expression, applied to $\varphi_n - \varphi$, goes to zero. Therefore, continuity is verified. Linearity again follows by the properties of the integral. Well-definedness follows from the two estimates above, which show $|T_f(\varphi)| < \infty$.

### Distribution spaces

If $\mathcal{X}$ is a function space of functions defined on $O \subseteq \mathbb{R}^d$ with a notion of convergence, then the set of all distributions on this space is usually denoted by $\mathcal{X}^*$. This set is also called a "distribution space". It is the dual space of $\mathcal{X}$.

### Theorem 1.4

Every tempered distribution, restricted to the bump functions, is a distribution.

*Proof*: Let $T$ be a tempered distribution, let $(\varphi_n)_{n \in \mathbb{N}}$ be a convergent sequence of bump functions with $\varphi$ as their limit, and let $\vartheta, \xi$ be two bump functions.

Theorem 1.1 gives us that $\varphi_n, \varphi$ are Schwartz functions.

Theorem 1.2 gives us that $\varphi_n \to \varphi$ in the sense of Schwartz functions.

From these two statements we can conclude, due to the continuity of $T$ on the Schwartz functions, that $T(\varphi_n) \to T(\varphi)$.

Theorem 1.1 tells us furthermore that $\vartheta, \xi$ are Schwartz functions. From this we can conclude, due to the linearity of $T$ on the Schwartz functions, that $T(\vartheta + \lambda \xi) = T(\vartheta) + \lambda T(\xi)$ for all $\lambda \in \mathbb{R}$.

This completes the proof.

## Operations on Distributions

### Lemma 1.5

Let $\mathcal{X}, \mathcal{Y}$ be function spaces, and let $\mathcal{L} : \mathcal{X} \to \mathcal{Y}$ be a linear function.

If there exists a linear operator $\mathcal{L}^* : \mathcal{Y} \to \mathcal{X}$, which is sequentially continuous^{[1]}, and it holds that:

$$\forall \varphi \in \mathcal{X}, \psi \in \mathcal{Y} : \quad \int \mathcal{L}\varphi(x) \, \psi(x) \, dx = \int \varphi(x) \, \mathcal{L}^*\psi(x) \, dx$$

*Then*, under these conditions, we may define the operator

$$\mathcal{L} : \mathcal{X}^* \to \mathcal{Y}^*, \quad (\mathcal{L}T)(\psi) := T(\mathcal{L}^*\psi),$$

which really maps to $\mathcal{Y}^*$, and for regular distributions $T_\varphi$ and $T_{\mathcal{L}\varphi}$ it will have the property

$$\mathcal{L}T_\varphi = T_{\mathcal{L}\varphi}.$$

*Proof*: Well-definedness follows from the fact that $\mathcal{L}^*\psi$ is a function of $\mathcal{X}$ due to the first requirement on $\mathcal{L}^*$. Linearity follows from the linearity of $T$ and linearity of $\mathcal{L}^*$:

$$(\mathcal{L}T)(\psi + \lambda \theta) = T(\mathcal{L}^*\psi + \lambda \mathcal{L}^*\theta) = (\mathcal{L}T)(\psi) + \lambda (\mathcal{L}T)(\theta).$$

Continuity follows just the same way from continuity of $T$ and $\mathcal{L}^*$: Let $\psi_n \to \psi$ w.r.t. the notion of conv. of $\mathcal{Y}$. Then

$$(\mathcal{L}T)(\psi_n) = T(\mathcal{L}^*\psi_n) \to T(\mathcal{L}^*\psi) = (\mathcal{L}T)(\psi).$$

The property

$$\mathcal{L}T_\varphi = T_{\mathcal{L}\varphi}$$

follows directly from the equation

- $\mathcal{L}T_\varphi(\psi) = \int \varphi(x) \, \mathcal{L}^*\psi(x) \, dx = \int \mathcal{L}\varphi(x) \, \psi(x) \, dx = T_{\mathcal{L}\varphi}(\psi)$.

### Multiplication by a smooth function

Let $h : O \to \mathbb{R}$ be a smooth function ("smooth" means it is infinitely often differentiable). Then, by defining $\mathcal{L}\varphi := h\varphi$ and $\mathcal{L}^*\psi := h\psi$, we meet the requirements of the above lemma and may define multiplication of distributions by smooth functions as follows:

- Let $T \in \mathcal{D}(O)^*$, then $(hT)(\varphi) := T(h\varphi)$
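For example, multiplying the delta distribution by a smooth $h$ gives $(h\delta_{x_0})(\varphi) = \delta_{x_0}(h\varphi) = h(x_0)\varphi(x_0)$. A minimal sketch (the function names are ours, and a Gaussian stands in for a test function):

```python
import math

def delta(phi, x0=1.5):
    # Dirac delta distribution at x0
    return phi(x0)

h = lambda t: t**2                 # smooth multiplier
phi = lambda t: math.exp(-t**2)    # stand-in for a test function

# (h·δ_{x0})(φ) is defined as δ_{x0}(h·φ), which evaluates to h(x0)·φ(x0)
lhs = delta(lambda t: h(t) * phi(t))
rhs = h(1.5) * phi(1.5)
```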

### Differentiation

For the bump functions and the Schwartz functions, we also may define the differentiation of distributions. Let $i \in \{1, \ldots, d\}$ and $\mathcal{L}\varphi := \partial_{x_i}\varphi$. Let's now define

- $\mathcal{L}^*\psi := -\partial_{x_i}\psi$.

Then, for the spaces $\mathcal{D}(O)$ or $\mathcal{S}(\mathbb{R}^d)$, the requirements for the above Lemma 1.5 are met and we may define the differentiation of a distribution in the following way:

$$(\partial_{x_i} T)(\varphi) := -T(\partial_{x_i}\varphi).$$

This definition also satisfies $(\partial_\alpha T)(\varphi) = (-1)^{|\alpha|} \, T(\partial_\alpha \varphi)$ for every multiindex $\alpha$.

*Proof*: By integration by parts, we obtain:

- $\int_O \partial_{x_i} f(x) \, \varphi(x) \, dx = \int_{\partial O} f(x) \varphi(x) \nu_i(x) \, dx - \int_O f(x) \, \partial_{x_i} \varphi(x) \, dx$,

where $\nu_i$ is the $i$-th component of the outward normal vector $\nu$ and $\partial O$ is the boundary of $O$. For bump functions, the boundary integral vanishes anyway, because the functions in $\mathcal{D}(O)$ are zero there. For Schwartz functions, we may use the identity

$$\int_{\mathbb{R}^d} \partial_{x_i} f(x) \, \varphi(x) \, dx = \lim_{R \to \infty} \int_{B_R(0)} \partial_{x_i} f(x) \, \varphi(x) \, dx$$

and the decreasing property of the Schwartz functions to see that the boundary integral goes to zero and therefore

$$\int_{\mathbb{R}^d} \partial_{x_i} f(x) \, \varphi(x) \, dx = -\int_{\mathbb{R}^d} f(x) \, \partial_{x_i} \varphi(x) \, dx.$$

To derive the equation

$$\int \partial_\alpha f(x) \, \varphi(x) \, dx = (-1)^{|\alpha|} \int f(x) \, \partial_\alpha \varphi(x) \, dx$$

, we may apply the formula from above several times. This finishes the proof, because this equation was the only non-trivial property of $\mathcal{L}^*$ which we need for applying Lemma 1.5.
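A classic consequence is that the distributional derivative of the Heaviside step function $H$ is the delta distribution: $T_H'(\varphi) = -T_H(\varphi') = -\int_0^\infty \varphi'(x)\,dx = \varphi(0)$. A numerical sketch of this computation (grid sizes and the Gaussian test function are arbitrary choices of ours):

```python
import numpy as np

x = np.linspace(-20.0, 20.0, 400001)
dx = x[1] - x[0]

H = (x > 0).astype(float)        # Heaviside step function
phi = np.exp(-x**2)              # test function, with phi(0) = 1
dphi = np.gradient(phi, dx)      # numerical derivative phi'

# T_H'(phi) := -T_H(phi') = -∫ H(x) phi'(x) dx, which should equal phi(0) = 1
val = -np.sum(H * dphi) * dx
```

The result approximates $\varphi(0) = 1$, i.e. $T_H' = \delta_0$, even though $H$ has no classical derivative at $0$.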

### Push-Forward

Let $O \subseteq \mathbb{R}^d$, $U \subseteq \mathbb{R}^b$ be domains, and let $\mu$ be a smooth function from $O$ to $U$, such that for all compact subsets $K \subseteq U$, $\mu^{-1}(K)$ is compact. Then we call the function

$$\mathcal{D}(U) \to \mathcal{D}(O), \quad \varphi \mapsto \varphi \circ \mu$$

the *pull-back* of bump functions.

If we choose $O = U = \mathbb{R}^d$, i. e. $\mu$ is a smooth function from $\mathbb{R}^d$ to $\mathbb{R}^d$ such that for all compact sets $K \subseteq \mathbb{R}^d$, $\mu^{-1}(K)$ is compact, then we also define the pull-back of Schwartz functions just exactly the same way:

$$\mathcal{S}(\mathbb{R}^d) \to \mathcal{S}(\mathbb{R}^d), \quad \varphi \mapsto \varphi \circ \mu.$$

For bump functions and Schwartz functions, we may define the *push-forward* of a distribution $T$:

For the bump functions

$$\mu_* : \mathcal{D}(O)^* \to \mathcal{D}(U)^*, \quad (\mu_* T)(\varphi) := T(\varphi \circ \mu),$$

or, for Schwartz functions:

$$\mu_* : \mathcal{S}(\mathbb{R}^d)^* \to \mathcal{S}(\mathbb{R}^d)^*, \quad (\mu_* T)(\varphi) := T(\varphi \circ \mu).$$
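For instance, the push-forward of a delta distribution moves its base point: $(\mu_* \delta_{x_0})(\varphi) = \delta_{x_0}(\varphi \circ \mu) = \varphi(\mu(x_0))$, so $\mu_* \delta_{x_0} = \delta_{\mu(x_0)}$. A minimal sketch (the names and the affine map are our own choices):

```python
import math

def delta(phi, x0):
    # Dirac delta distribution at x0
    return phi(x0)

mu = lambda x: 2.0 * x + 1.0       # a smooth map R -> R with compact preimages of compacta
phi = lambda x: math.exp(-x**2)    # stand-in for a test function

x0 = 0.5
# (mu_* delta_{x0})(phi) := delta_{x0}(phi ∘ mu) = phi(mu(x0))
pushed = delta(lambda x: phi(mu(x)), x0)
```

Here `pushed` equals $\varphi(\mu(0.5)) = \varphi(2)$, confirming that the base point $x_0 = 0.5$ has been pushed forward to $\mu(x_0) = 2$.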

### Convolution

Let $f \in L^1(\mathbb{R}^d)$, and let $\varphi$ be a bump function or a Schwartz function. Let's define

- $(f * \varphi)(x) := \int_{\mathbb{R}^d} f(y) \varphi(x - y) \, dy$.

This function ($\varphi \mapsto f * \varphi$) is linear, because the integral is linear. It is called the *convolution* of $f$ and $\varphi$.

We can also define: $f^-(x) := f(-x)$, and:

$$\mathcal{L}\varphi := f * \varphi, \quad \mathcal{L}^*\psi := f^- * \psi.$$

By the theorem of Fubini, we can calculate as follows (substituting $z = x - y$):

$$\int (f * \varphi)(x) \, \psi(x) \, dx = \int \int f(y) \varphi(x - y) \, dy \, \psi(x) \, dx = \int \varphi(z) \int f(y) \psi(z + y) \, dy \, dz = \int \varphi(z) \, (f^- * \psi)(z) \, dz.$$

Therefore, the first assumption for Lemma 1.5 holds.

Due to the Leibniz integral rule, we obtain that for $f \in L^1(\mathbb{R}^d)$ (i. e. $f$ is integrable) and $\varphi \in \mathcal{C}^n(\mathbb{R}^d)$ (i. e. the partial derivatives of $\varphi$ exist up to order $n$ and are also continuous):

- $\partial_\alpha (f * \varphi) = f * \partial_\alpha \varphi$ for all multiindices $\alpha$ with $|\alpha| \le n$,

With this formula, we can see (due to the monotony of the integral) that

$$\| \partial_\alpha (f * (\varphi_n - \varphi)) \|_\infty \le \|f\|_{L^1} \| \partial_\alpha (\varphi_n - \varphi) \|_\infty.$$

From this follows sequential continuity for Schwartz and bump functions by defining $\mathcal{L}\varphi := f * \varphi$ and $\mathcal{L}^*\psi := f^- * \psi$. Thus, with the help of Lemma 1.5, we can define the convolution with a distribution $T$ of $\mathcal{D}(\mathbb{R}^d)^*$ or $\mathcal{S}(\mathbb{R}^d)^*$ as follows:

$$(f * T)(\varphi) := T(f^- * \varphi).$$
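The Fubini identity above can be checked numerically on a grid. In this sketch (the grid and the concrete functions are our own choices), `np.convolve` on a symmetric grid plays the role of the continuous convolution, so that $f^-$ is just a reversal of the sample array:

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 2001)   # symmetric grid
dx = x[1] - x[0]

f = np.exp(-np.abs(x))               # integrable function
phi = np.exp(-x**2)                  # test function
psi = np.exp(-(x - 1.0)**2)          # second test function

def conv(a, b):
    # (a*b)(x) = ∫ a(y) b(x-y) dy, discretised on the common grid
    return np.convolve(a, b, mode="same") * dx

f_minus = f[::-1]                    # f^-(x) = f(-x)

lhs = np.sum(conv(f, phi) * psi) * dx         # ∫ (f*phi)(x) psi(x) dx
rhs = np.sum(phi * conv(f_minus, psi)) * dx   # ∫ phi(x) (f^-*psi)(x) dx
```

The two sums agree up to discretisation error, which is exactly the adjointness property that Lemma 1.5 requires of $\mathcal{L}^*$.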

## Notes

- ↑ This means in this case that if $\psi_n \to \psi$ with respect to the notion of convergence of $\mathcal{Y}$, then also $\mathcal{L}^*\psi_n \to \mathcal{L}^*\psi$ w.r.t. (= "with respect to") the notion of convergence of $\mathcal{X}$

## Exercises

- Show that $\mathbb{R}^d$ endowed with the usual topology is a topological vector space.

## Sources

- Rudin, Walter (1991). *Functional Analysis* (2nd ed.). McGraw-Hill. ISBN 9780070542365.