Engineering Analysis/Eigenvalues and Eigenvectors

The Eigen Problem

This page discusses eigenvalues and eigenvectors, which are important tools in linear algebra and which play an important role in state-space control systems. The "eigen problem," stated simply, is this: given an n × n square matrix A, find the n scalar values λ and the n corresponding non-trivial vectors v such that:

Av = λv

We call λ the eigenvalues of A, and we call v the corresponding eigenvectors of A. We can rearrange this equation as:

(A − λI)v = 0

For this equation to have a non-trivial solution v, the matrix (A − λI) must be singular. That is:

det(A − λI) = 0

Characteristic Equation

The characteristic equation of a square matrix A is given by:


[Characteristic Equation]

det(λI − A) = 0

Where I is the n × n identity matrix, and the solutions λ are the eigenvalues of matrix A. From this equation we can solve for the eigenvalues of A, and then, using the equations discussed above, we can calculate the corresponding eigenvectors.

In general, we can expand the characteristic equation as:


[Characteristic Polynomial]

det(λI − A) = λ^n + c_(n−1)λ^(n−1) + ... + c_1λ + c_0 = 0

This equation satisfies the following properties:

  1. c_0 = (−1)^n det(A)
  2. A is nonsingular if and only if c_0 is non-zero.
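
The nonsingularity test on c_0 can be illustrated numerically. The following is a minimal sketch for the 2 × 2 case, where the characteristic polynomial is λ^2 − (a + d)λ + (ad − bc) = 0, so c_0 is just the determinant ad − bc; the matrix entries here are hypothetical values chosen for illustration:

```python
# A minimal sketch of the nonsingularity test via c0 for a 2x2 matrix.
# For n = 2, the constant term c0 of the characteristic polynomial
# equals the determinant a*d - b*c.
# The entries below are hypothetical values chosen for illustration.

a, b, c, d = 4.0, 1.0, 2.0, 3.0

c0 = a * d - b * c        # constant term of the characteristic polynomial
print(c0)                 # 10.0
print(c0 != 0)            # True: the matrix is nonsingular
```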

Example: 2 × 2 Matrix

Let's say that X is a square matrix of order 2, as such:

X = [ a  b ]
    [ c  d ]

Then we can use this value in our characteristic equation:

det(λI − X) = (λ − a)(λ − d) − bc
            = λ^2 − (a + d)λ + (ad − bc) = 0

The roots of the above equation (the values of λ that satisfy the equality) are the eigenvalues of X.
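
The quadratic above can be solved directly. Here is a minimal sketch in plain Python; the example matrix is a hypothetical one chosen so the roots come out to whole numbers:

```python
import cmath

def eig2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]], i.e. the roots of
    lambda^2 - (a + d)*lambda + (a*d - b*c) = 0."""
    trace = a + d
    det = a * d - b * c
    disc = cmath.sqrt(trace * trace - 4 * det)  # complex-safe square root
    return (trace + disc) / 2, (trace - disc) / 2

# Hypothetical example: X = [[2, 1], [1, 2]] has eigenvalues 3 and 1.
l1, l2 = eig2x2(2, 1, 1, 2)
print(l1, l2)   # (3+0j) (1+0j)
```

Using cmath rather than math means complex eigenvalue pairs fall out of the same formula when the discriminant is negative.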

Eigenvalues

The solutions, λ, of the characteristic equation for matrix X are known as the eigenvalues of the matrix X.

Eigenvalues satisfy the following properties:

  1. If λ is an eigenvalue of A, then λ^n is an eigenvalue of A^n.
  2. If A is real and λ is a complex eigenvalue of A, then λ* (the complex conjugate) is also an eigenvalue of A.
  3. If any of the eigenvalues of A are zero, then A is singular. Conversely, if A is non-singular, all of the eigenvalues of A are nonzero.
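
Property 1 is easy to check numerically: if Av = λv, then applying A again gives A²v = λ²v. A minimal sketch for a hypothetical 2 × 2 matrix, using plain Python lists:

```python
# Numerical check of property 1 (if A v = lam v, then A^2 v = lam^2 v)
# for a hypothetical 2x2 matrix.

A = [[2.0, 1.0],
     [1.0, 2.0]]
v = [1.0, 1.0]          # an eigenvector of A for lam = 3

def matvec(M, x):
    # 2x2 matrix-vector product
    return [M[0][0]*x[0] + M[0][1]*x[1],
            M[1][0]*x[0] + M[1][1]*x[1]]

Av  = matvec(A, v)      # = 3 * v
AAv = matvec(A, Av)     # = A^2 v = 9 * v
print(Av, AAv)          # [3.0, 3.0] [9.0, 9.0]
```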

Eigenvectors

The defining eigenvalue equation can be written as such:

Xv = λv

Where X is the matrix under consideration, and λ is an eigenvalue of X. For every eigenvalue, there is a corresponding solution vector v to the above equation, known as an eigenvector. The above equation can also be rewritten as:

(X − λI)v = 0

Where the resulting vectors v for each eigenvalue λ are eigenvectors of X. An eigenvector is only determined up to a scalar multiple, and each distinct eigenvalue of X has at least one corresponding eigenvector. From this equation, we can see that the eigenvectors for λ make up the nullspace:

N(X − λI)

And therefore, we can find the eigenvectors through row-reduction of that matrix.
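
For a 2 × 2 matrix, that row-reduction collapses to a closed form: the first row of (X − λI) already determines the eigenvector direction. A minimal sketch, where the fallback branches for the degenerate cases are assumptions of this example:

```python
def eigvec2x2(a, b, c, d, lam):
    """One nullspace vector of (X - lam*I) for X = [[a, b], [c, d]],
    assuming lam really is an eigenvalue of X.  The result is only
    determined up to a scalar multiple."""
    if b != 0:
        return (b, lam - a)       # first row: (a - lam)*b + b*(lam - a) = 0
    if c != 0:
        return (lam - d, c)       # use the second row instead
    # X is diagonal: pick the matching coordinate axis
    return (1.0, 0.0) if lam == a else (0.0, 1.0)

# Hypothetical example: X = [[2, 1], [1, 2]], eigenvalue 3
print(eigvec2x2(2, 1, 1, 2, 3))   # (1, 1)
```

The second row is automatically satisfied because λ is a root of the characteristic polynomial, so (λ − a)(λ − d) = bc.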

Eigenvectors satisfy the following properties:

  1. If A is real and v is a complex eigenvector of A, then v* (the complex conjugate) is also an eigenvector of A.
  2. Eigenvectors corresponding to distinct eigenvalues of A are linearly independent.
  3. If A is n × n and has n linearly independent eigenvectors, then those eigenvectors form a complete basis set for ℝ^n (or ℂ^n, if the eigenvectors are complex).

Generalized Eigenvectors

Let's say that matrix A has the following characteristic polynomial:

det(λI − A) = (λ − λ_1)^(d_1) (λ − λ_2)^(d_2) ... (λ − λ_s)^(d_s)

Where d_1, d_2, ..., d_s are known as the algebraic multiplicities of the eigenvalues λ_1, λ_2, ..., λ_s. Also note that d_1 + d_2 + ... + d_s = n, and s < n. In other words, the eigenvalues of A are repeated, and such a matrix may not have n linearly independent eigenvectors. However, we can create vectors known as generalized eigenvectors to make up the missing eigenvectors by satisfying the following equations:

(A − λ_i I)v_1 = 0
(A − λ_i I)v_(k+1) = v_k
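
A minimal numeric sketch of this chain, using the hypothetical matrix A = [[5, 1], [0, 5]], which has the repeated eigenvalue λ = 5 but only one ordinary eigenvector:

```python
# Hypothetical example: A = [[5, 1], [0, 5]] has lambda = 5 with
# algebraic multiplicity 2, but only one ordinary eigenvector v1.
lam = 5.0
v1 = (1.0, 0.0)       # solves (A - 5I) v1 = 0

# (A - 5I) = [[0, 1], [0, 0]], so (A - 5I) v2 = v1 forces v2 = (t, 1);
# choosing t = 0 gives one valid generalized eigenvector.
v2 = (0.0, 1.0)

def chain_residual(v, rhs):
    # (A - lam*I) v - rhs, written out for this specific A
    return (v[1] - rhs[0], 0.0 - rhs[1])

print(chain_residual(v1, (0.0, 0.0)))   # (0.0, 0.0)
print(chain_residual(v2, v1))           # (0.0, 0.0)
```

Both residuals vanish, so v1 and v2 together supply the full set of two (generalized) eigenvectors for this matrix.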

Right and Left Eigenvectors

The equation for determining eigenvectors is:

Av = λv

And because the eigenvector v is on the right, these are more appropriately called "right eigenvectors". However, if we rewrite the equation as follows:

u^T A = λ u^T

The vectors u are called the "left eigenvectors" of matrix A.
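
A small sketch verifying a left eigenvector numerically for a hypothetical lower-triangular matrix (for real A, the left eigenvectors of A are the right eigenvectors of its transpose, with the same eigenvalues):

```python
# Hypothetical lower-triangular example: A = [[2, 0], [1, 3]].
A = [[2.0, 0.0],
     [1.0, 3.0]]
lam = 2.0
u = (1.0, 0.0)        # candidate left eigenvector for lam = 2

# u^T A, written out component by component
uTA = (u[0]*A[0][0] + u[1]*A[1][0],
       u[0]*A[0][1] + u[1]*A[1][1])
print(uTA)            # (2.0, 0.0) = lam * u, so u^T A = lam u^T holds
```

Note that the right eigenvector of A for λ = 2 is a different vector, (1, −1): left and right eigenvectors generally differ unless A is symmetric.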