Consider a scenario where the matrix representation of a system, A, differs from the actual implementation of the system by an error ΔA. In other words, the real system uses the matrix:

$$A + \Delta A$$

From the study of Control Systems, we know that the eigenvalues of a system affect its stability. For that reason, we would like to know how a small error in A affects the eigenvalues.

First off, we assume that ΔA is a *small* perturbation. The definition of "small" in this sense is arbitrary and will remain open; keep in mind that the techniques discussed here become more accurate the smaller ΔA is.
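
Before the derivation, it helps to see the effect numerically. The sketch below uses NumPy; the system matrix A and the perturbation ΔA are arbitrary, made-up values chosen only for illustration. A small ΔA produces a correspondingly small shift in the eigenvalues:

```python
import numpy as np

# An arbitrary example system matrix (hypothetical values).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

# A small perturbation Delta A -- "small" here means entries on the order of 1e-3.
dA = 1e-3 * np.array([[0.5, -0.2],
                      [0.1, 0.4]])

eig_nominal = np.sort(np.linalg.eigvals(A))        # eigenvalues of A
eig_perturbed = np.sort(np.linalg.eigvals(A + dA)) # eigenvalues of A + dA

print("nominal eigenvalues:  ", eig_nominal)
print("perturbed eigenvalues:", eig_perturbed)
print("shift:                ", eig_perturbed - eig_nominal)
```

The eigenvalue shift is of the same order as the entries of ΔA, which is exactly the relationship quantified below.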

If ΔA is the error in the matrix A, then Δλ is the resulting error in an eigenvalue λ, and Δv is the error in the corresponding right-eigenvector v. The eigenvalue equation becomes:

$$(A + \Delta A)(v + \Delta v) = (\lambda + \Delta \lambda)(v + \Delta v)$$

We now have one equation with two unknowns, Δλ and Δv: we still don't know how a small change in A will affect the eigenvalues and eigenvectors. If we multiply out both sides, we get:

$$Av + A\,\Delta v + \Delta A\, v + \Delta A\, \Delta v = \lambda v + \lambda\,\Delta v + \Delta\lambda\, v + \Delta\lambda\, \Delta v$$

This situation seems hopeless, until we multiply both sides from the left by the corresponding left-eigenvector, $w^T$:

$$w^T A v + w^T A\,\Delta v + w^T \Delta A\, v + w^T \Delta A\, \Delta v = \lambda\, w^T v + \lambda\, w^T \Delta v + \Delta\lambda\, w^T v + \Delta\lambda\, w^T \Delta v$$

Terms in which two Δ quantities (each small, by assumption) are multiplied together are second-order small, so we can neglect them: this removes the $w^T \Delta A\,\Delta v$ and $\Delta\lambda\, w^T \Delta v$ terms. Also, we know from our right-eigenvalue equation that:

$$Av = \lambda v, \qquad \text{so} \qquad w^T A v = \lambda\, w^T v$$

Another fact is that $w$ is a left-eigenvector of A, satisfying $w^T A = \lambda w^T$ (a related property is that left- and right-eigenvectors belonging to *different* eigenvalues are orthogonal). Applying the left-eigenvalue equation to the remaining term involving Δv, the following result holds:

$$w^T A\,\Delta v = \lambda\, w^T \Delta v$$

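These eigenvector facts can be checked numerically. The sketch below, using NumPy with an arbitrary made-up matrix, verifies the left-eigenvalue equation and the orthogonality of left- and right-eigenvectors belonging to different eigenvalues:

```python
import numpy as np

# Arbitrary example matrix with distinct eigenvalues (hypothetical values).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

# Right eigenvectors: columns of V satisfy A v = lambda v.
lam, V = np.linalg.eig(A)

# Left eigenvectors are the eigenvectors of A^T; match them to lam by eigenvalue.
lam_left, W = np.linalg.eig(A.T)
order = [int(np.argmin(np.abs(lam_left - l))) for l in lam]
W = W[:, order]

for i in range(len(lam)):
    w = W[:, i]
    # Left-eigenvalue equation: w^T A = lambda w^T
    assert np.allclose(w @ A, lam[i] * w)
    for j in range(len(lam)):
        if i != j:
            # Left/right eigenvectors of *different* eigenvalues are orthogonal.
            assert abs(w @ V[:, j]) < 1e-10

print("left-eigenvalue equation and orthogonality verified")
```
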
Substituting these results into the long equation above, the matching terms on the two sides cancel, and we get the following simplification:

$$w^T \Delta A\, v = \Delta\lambda\, w^T v$$

And solving for the change in the eigenvalue gives us:

$$\Delta\lambda = \frac{w^T \Delta A\, v}{w^T v}$$

This approximation holds only for small values of ΔA, and it becomes less precise as the error grows.
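
As a sanity check, the first-order formula can be compared against the exact eigenvalue shift. In the NumPy sketch below, the matrix A and the error ΔA are arbitrary, made-up values; Δλ is computed both from the formula and by re-solving the perturbed eigenvalue problem:

```python
import numpy as np

# Arbitrary example system and a small error Delta A (hypothetical values).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
dA = 1e-4 * np.array([[1.0, 2.0],
                      [-1.0, 0.5]])

lam, V = np.linalg.eig(A)           # right eigenvectors (columns of V)
lam_left, W = np.linalg.eig(A.T)    # left eigenvectors (columns of W)
order = [int(np.argmin(np.abs(lam_left - l))) for l in lam]
W = W[:, order]                     # match left eigenvectors to lam

# First-order prediction: d_lambda = (w^T dA v) / (w^T v), per eigenvalue.
predicted = np.array([(W[:, i] @ dA @ V[:, i]) / (W[:, i] @ V[:, i])
                      for i in range(len(lam))])

# Actual shift: match each perturbed eigenvalue to its nominal one.
lam_pert = np.linalg.eigvals(A + dA)
actual = np.array([lam_pert[np.argmin(np.abs(lam_pert - l))] - l for l in lam])

print("predicted shifts:", predicted)
print("actual shifts:   ", actual)
```

With ΔA this small, the prediction and the exact shift agree to roughly the square of the perturbation size; scaling ΔA up makes the discrepancy grow, matching the caveat above.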