Engineering Analysis/Random Vectors

Many of the concepts we have learned so far have dealt with scalar random variables. However, these concepts all translate to vectors of random variables. A random vector X contains N elements, X_i, each of which is a distinct random variable. The individual elements of a random vector may or may not be correlated with, or dependent on, one another.

Expectation

The expectation of a random vector is a vector of the expectation values of each element of the vector. For instance:

E[X] = \begin{bmatrix} E[X_1] & E[X_2] & \cdots & E[X_N] \end{bmatrix}^T

Using this definition, the mean vector of random vector X, denoted \mu_X, is the vector composed of the means of all the individual elements of X:

\mu_X = E[X] = \begin{bmatrix} \mu_{X_1} & \mu_{X_2} & \cdots & \mu_{X_N} \end{bmatrix}^T
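As a numerical illustration, here is a minimal NumPy sketch that estimates the mean vector of a 3-element random vector from samples (the mean values, covariance, and sample count are arbitrary choices for this example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw 10,000 samples of a 3-element random vector X.
# Each row of `samples` is one realization of X.
samples = rng.multivariate_normal(
    mean=[1.0, -2.0, 0.5],   # true mean vector mu_X (assumed for illustration)
    cov=np.eye(3),           # independent, unit-variance elements
    size=10_000,
)

# The mean vector is the element-wise expectation mu_X[i] = E[X_i],
# estimated here by the sample average.
mu_X = samples.mean(axis=0)
print(mu_X)  # close to [1.0, -2.0, 0.5]
```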

Correlation Matrix

The correlation matrix of a random vector X is defined as:

R_X = E[X X^T]

Each element of the correlation matrix is the correlation between one element of X and one element of X^T; that is, [R_X]_{ij} = E[X_i X_j]. The correlation matrix is real symmetric. If the off-diagonal elements of the correlation matrix are all zero, the random vector is said to be uncorrelated. If R_X is an identity matrix, the random vector is said to be "white". For instance, "white noise" is uncorrelated, and each of its elements has the same unit mean-square value, E[X_i^2] = 1.
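Numerically, R_X can be estimated by averaging the outer products x x^T over many realizations. A minimal NumPy sketch (the vector size and sample count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# 10,000 samples of a zero-mean, 3-element random vector (rows = realizations).
samples = rng.multivariate_normal(mean=np.zeros(3), cov=np.eye(3), size=10_000)

# Sample estimate of the correlation matrix R_X = E[X X^T]:
# average the outer products x x^T over all realizations.
R = samples.T @ samples / len(samples)
print(np.round(R, 2))  # approximately the 3x3 identity: this vector is "white"
```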

Matrix Diagonalization

As discussed earlier, we can diagonalize a matrix by constructing a matrix V whose columns are the eigenvectors of that matrix. If X is our non-diagonal matrix, we can create a diagonal matrix D by:

D = V^{-1} X V

If the X matrix is real symmetric (as is always the case with the correlation matrix), its eigenvectors can be chosen orthonormal, so that V^{-1} = V^T and this simplifies to:

D = V^T X V
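For a real symmetric matrix, NumPy's eigh returns an orthonormal eigenvector matrix, so the simplified form can be checked directly (the matrix values here are assumed for illustration):

```python
import numpy as np

# A real symmetric matrix, e.g. a correlation matrix.
X = np.array([[2.0, 0.5],
              [0.5, 1.0]])

# eigh returns eigenvalues and an orthonormal eigenvector matrix V
# for symmetric matrices, so V^{-1} = V^T.
eigvals, V = np.linalg.eigh(X)

# D = V^T X V is diagonal, with the eigenvalues of X on the diagonal.
D = V.T @ X @ V
print(np.round(D, 10))  # diagonal matrix
print(eigvals)          # matches the diagonal of D
```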

Whitening

A matrix can be whitened by constructing a diagonal matrix W that contains the inverse square roots of the eigenvalues of X on the diagonal (assuming all the eigenvalues are positive, as for a positive-definite correlation matrix):

W = \operatorname{diag}\!\left( \frac{1}{\sqrt{\lambda_1}}, \frac{1}{\sqrt{\lambda_2}}, \dots, \frac{1}{\sqrt{\lambda_N}} \right) = D^{-1/2}

Using this W matrix, we can convert X into the identity matrix:

I = W^T V^T X V W
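A short NumPy sketch of the whitening step, continuing the example above (matrix values assumed for illustration):

```python
import numpy as np

# Symmetric positive-definite matrix to whiten.
X = np.array([[2.0, 0.5],
              [0.5, 1.0]])

eigvals, V = np.linalg.eigh(X)       # X = V D V^T with orthonormal V
W = np.diag(1.0 / np.sqrt(eigvals))  # W = D^{-1/2}

# W^T V^T X V W reduces to W^T D W = I.
I = W.T @ V.T @ X @ V @ W
print(np.round(I, 10))               # 2x2 identity matrix
```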

Simultaneous Diagonalization

If we have two matrices, X and Y, we can construct a matrix A that will satisfy the following relationships:

A^T X A = I
A^T Y A = D

Where I is an identity matrix, and D is a diagonal matrix. This process is known as simultaneous diagonalization. If we have the V and W matrices described above such that

W^T V^T X V W = I,

we can then construct the B matrix by applying this same transformation to the Y matrix:

B = W^T V^T Y V W

We can collect the eigenvectors of B (which is also real symmetric) into an orthogonal transformation matrix Z such that:

Z^T B Z = D

We can then define our A matrix as:

A = V W Z
A^T = Z^T W^T V^T

This A matrix satisfies the simultaneous diagonalization relationships outlined above: A^T X A = Z^T (W^T V^T X V W) Z = Z^T Z = I, and A^T Y A = Z^T B Z = D.
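Putting the three steps together, here is a NumPy sketch of the full simultaneous-diagonalization procedure (the matrices X and Y are randomly generated symmetric positive-definite examples):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two symmetric positive-definite matrices, generated for illustration.
M1 = rng.standard_normal((3, 3)); X = M1 @ M1.T + 3 * np.eye(3)
M2 = rng.standard_normal((3, 3)); Y = M2 @ M2.T + 3 * np.eye(3)

# Step 1: whiten X, so that W^T V^T X V W = I.
eigvals, V = np.linalg.eigh(X)
W = np.diag(1.0 / np.sqrt(eigvals))

# Step 2: apply the same transformation to Y and diagonalize the result.
B = W.T @ V.T @ Y @ V @ W
_, Z = np.linalg.eigh(B)   # orthonormal eigenvectors of the symmetric B

# Step 3: combine into A = V W Z.
A = V @ W @ Z

print(np.round(A.T @ X @ A, 10))  # identity matrix
print(np.round(A.T @ Y @ A, 10))  # diagonal matrix
```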

Covariance Matrix

The Covariance Matrix of two random vectors, X and Y, is defined as:

C_{XY} = E[(X - \mu_X)(Y - \mu_Y)^T]

Each element of the covariance matrix expresses the covariance between one element of X and one element of Y; that is, [C_{XY}]_{ij} = E[(X_i - \mu_{X_i})(Y_j - \mu_{Y_j})]. The covariance matrix of a vector with itself, C_X = C_{XX}, is real symmetric.

We can relate the correlation matrix and the covariance matrix through the following formula:

C_X = R_X - \mu_X \mu_X^T
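This relationship is easy to verify numerically; when the sample correlation, covariance, and mean are all computed from the same data, the identity holds up to floating-point rounding (the mean vector and sample count here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples of a 3-element random vector with a nonzero mean (values assumed).
mu_true = np.array([1.0, -2.0, 0.5])
samples = rng.multivariate_normal(mean=mu_true, cov=np.eye(3), size=100_000)

mu = samples.mean(axis=0)                              # mean vector mu_X
R = samples.T @ samples / len(samples)                 # R_X = E[X X^T]
C = (samples - mu).T @ (samples - mu) / len(samples)   # C_X

# C_X = R_X - mu_X mu_X^T, exact for sample moments sharing the same mean.
print(np.round(R - np.outer(mu, mu) - C, 12))          # zero matrix
```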

Cumulative Distribution Function

An N-vector X has a cumulative distribution function F_X of N variables that is defined as:

F_X(x_1, x_2, \dots, x_N) = P[X_1 \le x_1,\ X_2 \le x_2,\ \dots,\ X_N \le x_N]
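For example, if X is a 2-vector whose elements are independent and uniformly distributed on [0, 1] (a distribution assumed here for simplicity), the joint CDF factors into the product of the marginal CDFs:

F_X(x_1, x_2) = P[X_1 \le x_1] \, P[X_2 \le x_2] = x_1 x_2, \qquad 0 \le x_1, x_2 \le 1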

Probability Density Function

The probability density function of a random vector can be defined in terms of the Nth-order mixed partial derivative of the cumulative distribution function:

f_X(x_1, x_2, \dots, x_N) = \frac{\partial^N F_X(x_1, x_2, \dots, x_N)}{\partial x_1\, \partial x_2 \cdots \partial x_N}

If we know the density function, we can find the marginal density of the ith element of X by integrating the joint density over the other N − 1 variables, and from that marginal we can find the mean of the ith element:

f_{X_i}(x_i) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f_X(x_1, \dots, x_N)\, dx_1 \cdots dx_{i-1}\, dx_{i+1} \cdots dx_N

\mu_{X_i} = \int_{-\infty}^{\infty} x_i\, f_{X_i}(x_i)\, dx_i
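As a sketch of this computation, the following NumPy example integrates out one variable of an assumed 2-dimensional joint density (independent standard normals) on a grid to get a marginal density, then integrates once more to get the mean:

```python
import numpy as np

# Joint density of a 2-element vector: independent standard normals
# (chosen for illustration; any joint density would work).
def f_X(x1, x2):
    return np.exp(-(x1**2 + x2**2) / 2) / (2 * np.pi)

# Grid over a region that captures essentially all the probability mass.
x = np.linspace(-8, 8, 801)
X1, X2 = np.meshgrid(x, x, indexing="ij")
dx = x[1] - x[0]

# Marginal density of X_1: integrate the joint density over x2
# (the N - 1 = 1 "extra" integration).
f_X1 = f_X(X1, X2).sum(axis=1) * dx

# Mean of X_1: one more integration against the marginal density.
mu_1 = (x * f_X1).sum() * dx
print(mu_1)  # approximately 0.0 for a standard normal
```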