Real Analysis
Differentiation in Rn

We will first revise some important concepts of linear algebra that are of importance in multivariate analysis. The reader with no background in linear algebra is advised to refer to the book Linear Algebra.

Vector Space

A set $V$ is said to be a Vector Space over a field $F$ if and only if the operations of addition and scalar multiplication are defined on it so as to satisfy, for all $\mathbf{x}, \mathbf{y}, \mathbf{z} \in V$ and all $a, b \in F$:

(i) Commutativity: $\mathbf{x} + \mathbf{y} = \mathbf{y} + \mathbf{x}$

(ii) Associativity: $(\mathbf{x} + \mathbf{y}) + \mathbf{z} = \mathbf{x} + (\mathbf{y} + \mathbf{z})$

(iii) Identity: There exists $\mathbf{0} \in V$ such that $\mathbf{x} + \mathbf{0} = \mathbf{x}$

(iv) Inverse: There exists $-\mathbf{x} \in V$ such that $\mathbf{x} + (-\mathbf{x}) = \mathbf{0}$

(v) Distributivity over vector addition: $a(\mathbf{x} + \mathbf{y}) = a\mathbf{x} + a\mathbf{y}$

(vi) Distributivity over scalar addition: $(a + b)\mathbf{x} = a\mathbf{x} + b\mathbf{x}$

(vii) Compatibility of scalar multiplication: $a(b\mathbf{x}) = (ab)\mathbf{x}$ and $1\mathbf{x} = \mathbf{x}$

Members of a vector space are called "vectors" and those of the field are called "scalars". $\mathbb{R}^n$, the set of all polynomials, etc. are examples of vector spaces.
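For instance, in $\mathbb{R}^2$ over the field $\mathbb{R}$, addition and scalar multiplication are defined componentwise, $(x_1, x_2) + (y_1, y_2) = (x_1 + y_1, x_2 + y_2)$ and $c(x_1, x_2) = (cx_1, cx_2)$, and each axiom above reduces to the corresponding property of the real numbers.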

A set of linearly independent vectors that spans the vector space is said to be a Basis for the vector space.
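For example, the standard basis of $\mathbb{R}^n$ is $\mathbf{e}_1 = (1, 0, \ldots, 0), \ \mathbf{e}_2 = (0, 1, \ldots, 0), \ \ldots, \ \mathbf{e}_n = (0, 0, \ldots, 1)$: these vectors are linearly independent, and every $\mathbf{x} = (x_1, \ldots, x_n) \in \mathbb{R}^n$ can be written as $\mathbf{x} = \sum_{i=1}^{n} x_i \mathbf{e}_i$.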

Linear Transformations

Let $V_1, V_2$ be vector spaces.

Let $T : V_1 \to V_2$

We say that $T$ is a Linear Transformation if and only if for all $\mathbf{x}, \mathbf{y} \in V_1$ and all scalars $c$,

(i) $T(\mathbf{x} + \mathbf{y}) = T(\mathbf{x}) + T(\mathbf{y})$

(ii) $T(c\mathbf{x}) = cT(\mathbf{x})$
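For example, every $m \times n$ real matrix $A$ gives a linear transformation $T : \mathbb{R}^n \to \mathbb{R}^m$ via $T(\mathbf{x}) = A\mathbf{x}$; both properties follow from the distributivity of matrix multiplication. Conversely, as we will see below, once bases are fixed every linear transformation between these spaces is represented by such a matrix.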


As we will see, there are two major ways to define a 'derivative' of a multivariable function. We first present the seemingly more straightforward way of using "Partial Derivatives".

Directional and Partial Derivatives

Let $A \subseteq \mathbb{R}^n$ be open, and let $\mathbf{x}_0 \in A$ and $\mathbf{u} \in \mathbb{R}^n$.

Let $f : A \to \mathbb{R}^m$

We say that $f$ is differentiable at $\mathbf{x}_0$ with respect to the vector $\mathbf{u}$ if and only if there exists $\mathbf{L} \in \mathbb{R}^m$ that satisfies

$\lim_{t \to 0} \dfrac{f(\mathbf{x}_0 + t\mathbf{u}) - f(\mathbf{x}_0)}{t} = \mathbf{L}$

$\mathbf{L}$ is said to be the derivative of $f$ at $\mathbf{x}_0$ with respect to $\mathbf{u}$ and is written as $f'(\mathbf{x}_0; \mathbf{u})$.
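This limit can be checked numerically. The following is a minimal sketch; the test function $f(x, y) = x^2 + y^2$ and the step size are our own illustrative choices, not from the text. It approximates $f'(\mathbf{x}_0; \mathbf{u})$ by a central difference in $t$.

```python
import numpy as np

def directional_derivative(f, x0, u, t=1e-6):
    """Central-difference approximation of the derivative of f
    at x0 with respect to the vector u."""
    return (f(x0 + t * u) - f(x0 - t * u)) / (2 * t)

# Illustrative test function f(x, y) = x^2 + y^2 (not from the text).
f = lambda x: x[0]**2 + x[1]**2

x0 = np.array([1.0, 2.0])
u = np.array([0.0, 1.0])

# Exact value: the gradient (2x, 2y) = (2, 4) dotted with u gives 4.
print(directional_derivative(f, x0, u))  # ~4.0
```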

When $\mathbf{u}$ is a unit vector, the derivative is said to be a directional derivative; when $\mathbf{u}$ is one of the standard basis vectors $\mathbf{e}_i$, it is said to be a partial derivative. Here we will explicitly define partial derivatives and see some of their properties.


Let $f$ be a real multivariate function defined on an open subset $U$ of $\mathbb{R}^n$,

$f : U \subseteq \mathbb{R}^n \to \mathbb{R}$.

Then the partial derivative at some parameter $a = (a_1, \ldots, a_n) \in U$ with respect to the coordinate $x_i$ is defined as the following limit

$\dfrac{\partial f}{\partial x_i}(a) = \lim_{h \to 0} \dfrac{f(a_1, \ldots, a_i + h, \ldots, a_n) - f(a_1, \ldots, a_n)}{h}$.
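For example, for $f(x, y) = x^2 y$ the partial derivatives are obtained by differentiating in one coordinate while treating the other as a constant: $\dfrac{\partial f}{\partial x}(x, y) = 2xy$ and $\dfrac{\partial f}{\partial y}(x, y) = x^2$.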

$f$ is said to be differentiable at this parameter $a$ if the difference $f(a + h) - f(a)$ is equivalent up to first order in $h$ to a linear form $L$ (of $h$), that is

$f(a + h) - f(a) = L(h) + o(\|h\|)$

The linear form $L$ is then said to be the differential of $f$ at $a$, and is written as $\mathrm{d}f(a)$ or sometimes $\mathrm{d}f_a$.

In this case, where $f$ is differentiable at $a$, by linearity we can write

$\mathrm{d}f(a)(h) = \sum_{i=1}^{n} \dfrac{\partial f}{\partial x_i}(a) \, h_i$

$f$ is said to be continuously differentiable if its differential is defined at every parameter in its domain, and if the differential varies continuously with the parameter $a$, that is, if its coordinates (as a linear form) $\dfrac{\partial f}{\partial x_i}(a)$ vary continuously.
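Continuing the example above: for $f(x, y) = x^2 y$ at $a = (1, 2)$ we have $\dfrac{\partial f}{\partial x}(a) = 4$ and $\dfrac{\partial f}{\partial y}(a) = 1$, so the differential is the linear form $\mathrm{d}f(a)(h) = 4h_1 + h_2$. Since $2xy$ and $x^2$ are continuous everywhere, $f$ is continuously differentiable on all of $\mathbb{R}^2$.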

In case the partial derivatives exist but $f$ is not differentiable, and sometimes not even continuous, for example

$f(x, y) = \dfrac{xy}{x^2 + y^2} \quad \text{for } (x, y) \neq (0, 0)$

(and $f(0, 0) = 0$), we say that $f$ is separably differentiable.
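To see why this $f$ fails to be continuous at the origin even though its partial derivatives exist there, note that $f(t, t) = \dfrac{t^2}{2t^2} = \dfrac{1}{2}$ for every $t \neq 0$, so $f$ does not tend to $f(0, 0) = 0$ along the diagonal, while $\dfrac{\partial f}{\partial x}(0, 0) = \lim_{h \to 0} \dfrac{f(h, 0) - f(0, 0)}{h} = 0$ and likewise $\dfrac{\partial f}{\partial y}(0, 0) = 0$.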

Total Derivatives

The total derivative is important as it preserves some of the key properties of the single-variable derivative, most notably the assertion that differentiability implies continuity.

Let $A \subseteq \mathbb{R}^n$ be open, let $f : A \to \mathbb{R}^m$, and let $\mathbf{x}_0 \in A$.

We say that $f$ is differentiable at $\mathbf{x}_0$ if and only if there exists a linear transformation, $\lambda : \mathbb{R}^n \to \mathbb{R}^m$, called the derivative or total derivative of $f$ at $\mathbf{x}_0$, such that

$\lim_{\mathbf{h} \to \mathbf{0}} \dfrac{\|f(\mathbf{x}_0 + \mathbf{h}) - f(\mathbf{x}_0) - \lambda(\mathbf{h})\|}{\|\mathbf{h}\|} = 0$

One should read $\lambda(\mathbf{h})$ as the linear transformation $\lambda$ applied to the vector $\mathbf{h}$. Sometimes it is customary to write this as $\lambda\mathbf{h}$.
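For example, take $f : \mathbb{R}^2 \to \mathbb{R}$, $f(x, y) = x^2 + y^2$, and $\mathbf{x}_0 = (1, 2)$. The candidate total derivative is $\lambda(\mathbf{h}) = 2h_1 + 4h_2$, and indeed $f(\mathbf{x}_0 + \mathbf{h}) - f(\mathbf{x}_0) - \lambda(\mathbf{h}) = h_1^2 + h_2^2 = \|\mathbf{h}\|^2$, so the quotient in the definition equals $\|\mathbf{h}\|$, which tends to $0$.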


Theorem

Suppose $A \subseteq \mathbb{R}^n$ is an open set and $f : A \to \mathbb{R}^m$ is differentiable on $A$. Think of writing $f$ in components, so $f = (f_1, \ldots, f_m)$. Then the partial derivatives $\dfrac{\partial f_j}{\partial x_i}$ exist, and the matrix representing the linear transformation $Df(\mathbf{x}_0)$ with respect to the standard bases of $\mathbb{R}^n$ and $\mathbb{R}^m$ is given by the Jacobian Matrix:

$J_f = \begin{pmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{pmatrix}$

evaluated at $\mathbf{x}_0$.
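The Jacobian can be checked numerically by approximating each partial derivative with a finite difference. The following is a minimal sketch; the example map $f(x, y) = (x^2 y, \sin y)$ and the step size are illustrative choices of ours, not from the text.

```python
import numpy as np

def numerical_jacobian(f, x0, eps=1e-6):
    """Approximate the Jacobian of f at x0, column by column,
    using central differences along each standard basis vector."""
    m, n = len(f(x0)), len(x0)
    J = np.zeros((m, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = eps
        J[:, i] = (f(x0 + e) - f(x0 - e)) / (2 * eps)
    return J

# Illustrative map f(x, y) = (x^2 y, sin y); its Jacobian is
# [[2xy, x^2], [0, cos y]].
f = lambda x: np.array([x[0]**2 * x[1], np.sin(x[1])])
x0 = np.array([1.0, 2.0])

print(numerical_jacobian(f, x0))
# ~[[4.0, 1.0], [0.0, -0.416]]  (cos 2 ≈ -0.416)
```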

NOTE: This theorem requires the function to be differentiable to begin with. It is a common mistake to assume that if the partial derivatives exist then the function must be differentiable, simply because we can then construct the Jacobian matrix. This, however, is false: the function $f(x, y) = \dfrac{xy}{x^2 + y^2}$ above has partial derivatives everywhere but is not even continuous at the origin. This brings us to the next theorem:

Theorem

Suppose $A \subseteq \mathbb{R}^n$ is an open set and $f : A \to \mathbb{R}^m$. Think of writing $f$ in components, so $f = (f_1, \ldots, f_m)$. If $\dfrac{\partial f_j}{\partial x_i}$ exists and is continuous on $A$ for all $i = 1, \ldots, n$ and for all $j = 1, \ldots, m$, then $f$ is differentiable on $A$.


This theorem gives us a nice criterion for a function to be differentiable.
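For example, any map whose components are polynomials, such as $f(x, y) = (x^2 y, x + y^3)$, is differentiable on all of $\mathbb{R}^2$: each partial derivative ($2xy$, $x^2$, $1$, $3y^2$) exists and is continuous everywhere, so the theorem applies.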