Linear Algebra/Eigenvalues and Eigenvectors
In this subsection we will focus on the property of Corollary 2.4.
- Definition 3.1
A transformation $t:V\to V$ has a scalar eigenvalue $\lambda$ if there is a nonzero eigenvector $\vec{\zeta}\in V$ such that $t(\vec{\zeta})=\lambda\cdot\vec{\zeta}$.
("Eigen" is German for "characteristic of" or "peculiar to"; some authors call these characteristic values and vectors. No authors call them "peculiar".)
- Example 3.2
The projection map
$$\pi:\mathbb{R}^3\to\mathbb{R}^3 \qquad \begin{pmatrix}x\\y\\z\end{pmatrix}\mapsto\begin{pmatrix}x\\y\\0\end{pmatrix}$$
has an eigenvalue of $1$ associated with any eigenvector of the form
$$\begin{pmatrix}x\\y\\0\end{pmatrix}$$
where $x$ and $y$ are scalars at least one of which is non-$0$. On the other hand, $2$ is not an eigenvalue of $\pi$ since no non-$\vec{0}$ vector is doubled.
That example shows why the "non-$\vec{0}$" appears in the definition. Disallowing $\vec{0}$ as an eigenvector eliminates trivial eigenvalues: every scalar $\lambda$ satisfies $t(\vec{0})=\lambda\cdot\vec{0}$, so admitting $\vec{0}$ would make every scalar an eigenvalue.
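As a quick numerical sketch of Example 3.2 (assuming NumPy, and taking the projection to be the map $(x,y,z)\mapsto(x,y,0)$ in its standard-basis matrix form), we can check that vectors of the stated form are fixed and that no vector is doubled:

```python
import numpy as np

# Standard-basis matrix of the projection (x, y, z) |-> (x, y, 0).
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

v = np.array([3.0, -2.0, 0.0])   # a vector of the form (x, y, 0)
print(P @ v)                     # [ 3. -2.  0.]  -- unchanged, so 1 is an eigenvalue

print(np.linalg.eigvals(P))      # [1. 1. 0.]  -- 2 never appears: no nonzero vector is doubled
```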
- Example 3.3
The only transformation on the trivial space $\{\vec{0}\}$ is
$$\vec{0}\mapsto\vec{0}.$$
This map has no eigenvalues because there are no non-$\vec{0}$ vectors $\vec{v}$ mapped to a scalar multiple $\lambda\cdot\vec{v}$ of themselves.
- Example 3.4
Consider the homomorphism $t:\mathcal{P}_1\to\mathcal{P}_1$ given by $c_0+c_1x\mapsto(c_0+c_1)+(c_0+c_1)x$. The range of $t$ is one-dimensional. Thus an application of $t$ to a vector in the range will simply rescale that vector: $c+cx\mapsto 2c+2cx$. That is, $t$ has an eigenvalue of $2$ associated with eigenvectors of the form $c+cx$ where $c\neq 0$.
This map also has an eigenvalue of $0$ associated with eigenvectors of the form $c-cx$ where $c\neq 0$.
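In coordinates this example can be checked mechanically. The sketch below assumes the map $c_0+c_1x\mapsto(c_0+c_1)+(c_0+c_1)x$ and represents it with respect to the basis $\langle 1,x\rangle$ of $\mathcal{P}_1$:

```python
import numpy as np

# With respect to <1, x>, the map c0 + c1*x |-> (c0+c1) + (c0+c1)*x has this matrix.
T = np.array([[1.0, 1.0],
              [1.0, 1.0]])

values, vectors = np.linalg.eig(T)
print(values)    # 2 and 0
print(vectors)   # columns proportional to (1, 1) and (1, -1),
                 # i.e. the polynomials c + cx and c - cx
```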
- Definition 3.5
A square matrix $T$ has a scalar eigenvalue $\lambda$ associated with the non-$\vec{0}$ eigenvector $\vec{\zeta}$ if $T\vec{\zeta}=\lambda\cdot\vec{\zeta}$.
- Remark 3.6
Although this extension from maps to matrices is obvious, there is a point that must be made. Eigenvalues of a map are also the eigenvalues of matrices representing that map, and so similar matrices have the same eigenvalues. But the eigenvectors are different: similar matrices need not have the same eigenvectors.
For instance, consider again the transformation $t:\mathcal{P}_1\to\mathcal{P}_1$ given by $c_0+c_1x\mapsto(c_0+c_1)+(c_0+c_1)x$. It has an eigenvalue of $2$ associated with eigenvectors of the form $c+cx$ where $c\neq 0$. If we represent $t$ with respect to $B=\langle 1+1x,\,1-1x\rangle$ then
$$T=\mathrm{Rep}_{B,B}(t)=\begin{pmatrix}2&0\\0&0\end{pmatrix}$$
and $2$ is an eigenvalue of $T$, associated with these eigenvectors.
$$\left\{\begin{pmatrix}c\\0\end{pmatrix}\;\Big|\;c\in\mathbb{C},\ c\neq 0\right\}$$
On the other hand, representing $t$ with respect to $D=\langle 1,\,x\rangle$ gives
$$S=\mathrm{Rep}_{D,D}(t)=\begin{pmatrix}1&1\\1&1\end{pmatrix}$$
and the eigenvectors of $S$ associated with the eigenvalue $2$ are these.
$$\left\{\begin{pmatrix}c\\c\end{pmatrix}\;\Big|\;c\in\mathbb{C},\ c\neq 0\right\}$$
Thus similar matrices can have different eigenvectors.
Here is an informal description of what's happening. The underlying transformation doubles the eigenvectors $c+cx$. But when the matrix representing the transformation is $T=\mathrm{Rep}_{B,B}(t)$ then it "assumes" that column vectors are representations with respect to $B$. In contrast, $S=\mathrm{Rep}_{D,D}(t)$ "assumes" that column vectors are representations with respect to $D$. So the vectors that get doubled by each matrix look different.
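A small numerical illustration of the remark (the matrices here are ours, chosen only to make the point): similar matrices share eigenvalues, but the eigenvectors come out different.

```python
import numpy as np

T = np.array([[2.0, 0.0],
              [0.0, 0.0]])
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])          # an invertible change-of-basis matrix
S = P @ T @ np.linalg.inv(P)         # a matrix similar to T

wT, vT = np.linalg.eig(T)
wS, vS = np.linalg.eig(S)
print(sorted(wT), sorted(wS))        # both [0.0, 2.0]: the same eigenvalues
print(vT[:, np.argmax(wT)])          # eigenvector of T for 2: a multiple of (1, 0)
print(vS[:, np.argmax(wS)])          # eigenvector of S for 2: a multiple of (1, 1)
```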
The next example illustrates the basic tool for finding eigenvectors and eigenvalues.
- Example 3.7
What are the eigenvalues and eigenvectors of this matrix?
$$T=\begin{pmatrix}1&2&1\\2&0&-2\\-1&2&3\end{pmatrix}$$
To find the scalars $x$ such that $T\vec{\zeta}=x\vec{\zeta}$ for non-$\vec{0}$ eigenvectors $\vec{\zeta}$, bring everything to the left-hand side
$$\begin{pmatrix}1&2&1\\2&0&-2\\-1&2&3\end{pmatrix}\begin{pmatrix}z_1\\z_2\\z_3\end{pmatrix}-x\begin{pmatrix}z_1\\z_2\\z_3\end{pmatrix}=\vec{0}$$
and factor $(T-xI)\vec{\zeta}=\vec{0}$. (Note that it says $T-xI$; the expression $T-x$ doesn't make sense because $T$ is a matrix while $x$ is a scalar.) This homogeneous linear system
$$\begin{pmatrix}1-x&2&1\\2&-x&-2\\-1&2&3-x\end{pmatrix}\begin{pmatrix}z_1\\z_2\\z_3\end{pmatrix}=\begin{pmatrix}0\\0\\0\end{pmatrix}$$
has a non-$\vec{0}$ solution if and only if the matrix is singular. We can determine when that happens.
$$0=\det(T-xI)=\begin{vmatrix}1-x&2&1\\2&-x&-2\\-1&2&3-x\end{vmatrix}=-x^3+4x^2-4x=-x(x-2)^2$$
The eigenvalues are $\lambda_1=0$ and $\lambda_2=2$. To find the associated eigenvectors, plug in each eigenvalue. Plugging in $\lambda_1=0$ gives
$$\begin{pmatrix}1&2&1\\2&0&-2\\-1&2&3\end{pmatrix}\begin{pmatrix}z_1\\z_2\\z_3\end{pmatrix}=\begin{pmatrix}0\\0\\0\end{pmatrix}\qquad\Longrightarrow\qquad\begin{pmatrix}z_1\\z_2\\z_3\end{pmatrix}=\begin{pmatrix}a\\-a\\a\end{pmatrix}$$
for a scalar parameter $a$ ($a$ is non-$0$ because eigenvectors must be non-$\vec{0}$). In the same way, plugging in $\lambda_2=2$ gives
$$\begin{pmatrix}-1&2&1\\2&-2&-2\\-1&2&1\end{pmatrix}\begin{pmatrix}z_1\\z_2\\z_3\end{pmatrix}=\begin{pmatrix}0\\0\\0\end{pmatrix}\qquad\Longrightarrow\qquad\begin{pmatrix}z_1\\z_2\\z_3\end{pmatrix}=\begin{pmatrix}b\\0\\b\end{pmatrix}$$
with $b\neq 0$.
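The same computation can be done symbolically; the sketch below uses SymPy and assumes the matrix displayed above.

```python
import sympy as sp

T = sp.Matrix([[ 1, 2,  1],
               [ 2, 0, -2],
               [-1, 2,  3]])
x = sp.symbols('x')

# SymPy's charpoly is det(xI - T), so it differs from det(T - xI) only by a sign.
print(sp.factor(T.charpoly(x).as_expr()))   # x*(x - 2)**2
print(T.eigenvects())
# eigenvalue 0 with eigenvector proportional to (1, -1, 1),
# eigenvalue 2 (a double root) with eigenvector proportional to (1, 0, 1)
```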
- Example 3.8
If
$$S=\begin{pmatrix}\pi&1\\0&3\end{pmatrix}$$
(here $\pi$ is not a projection map, it is the number $3.14\ldots$) then
$$\det(S-xI)=(\pi-x)(3-x)$$
so $S$ has eigenvalues of $x_1=\pi$ and $x_2=3$. To find associated eigenvectors, first plug in $x_1=\pi$ for $x$:
$$\begin{pmatrix}\pi-\pi&1\\0&3-\pi\end{pmatrix}\begin{pmatrix}z_1\\z_2\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix}\qquad\Longrightarrow\qquad\begin{pmatrix}z_1\\z_2\end{pmatrix}=\begin{pmatrix}a\\0\end{pmatrix}$$
for a scalar $a\neq 0$, and then plug in $x_2=3$:
$$\begin{pmatrix}\pi-3&1\\0&0\end{pmatrix}\begin{pmatrix}z_1\\z_2\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix}\qquad\Longrightarrow\qquad\begin{pmatrix}z_1\\z_2\end{pmatrix}=\begin{pmatrix}-b/(\pi-3)\\b\end{pmatrix}$$
where $b\neq 0$.
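Because $\pi$ is easiest to handle exactly, a symbolic check is natural here; this sketch assumes the matrix written above.

```python
import sympy as sp

S = sp.Matrix([[sp.pi, 1],
               [0,     3]])
x = sp.symbols('x')

print(sp.factor(S.charpoly(x).as_expr()))   # (x - 3)*(x - pi)
print(S.eigenvects())
# eigenvalue pi with eigenvector proportional to (1, 0),
# eigenvalue 3 with eigenvector proportional to (-1/(pi - 3), 1)
```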
- Definition 3.9
The characteristic polynomial of a square matrix $T$ is the determinant of the matrix $T-xI$, where $x$ is a variable. The characteristic equation is $0=\det(T-xI)$. The characteristic polynomial of a transformation $t$ is the polynomial of any $\mathrm{Rep}_{B,B}(t)$.
Problem 11 checks that the characteristic polynomial of a transformation is well-defined, that is, any choice of basis yields the same polynomial.
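The well-definedness can also be seen numerically: changing the basis replaces a representing matrix $T$ by $PTP^{-1}$, and the characteristic polynomial is unchanged. A sketch (the change-of-basis matrix $P$ here is an arbitrary illustrative choice):

```python
import numpy as np

T = np.array([[ 1.0, 2.0,  1.0],
              [ 2.0, 0.0, -2.0],
              [-1.0, 2.0,  3.0]])
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])          # any nonsingular matrix will do

print(np.poly(T))                        # coefficients of the characteristic polynomial
print(np.poly(P @ T @ np.linalg.inv(P))) # the same coefficients, up to round-off
```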
- Lemma 3.10
A linear transformation on a nontrivial vector space has at least one eigenvalue.
- Proof
The characteristic polynomial of a transformation on a nontrivial space is the characteristic polynomial of an $n\times n$ matrix with $n\geq 1$, and so has degree at least one. Any root of the characteristic polynomial is an eigenvalue, and over the complex numbers any polynomial of degree one or greater has a root. (This is the reason that in this chapter we've gone to scalars that are complex.)
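A rotation of the real plane makes the point concrete: no nonzero real vector keeps its direction, but over the complex numbers the characteristic polynomial still has roots. A quick NumPy check:

```python
import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation by 90 degrees

print(np.linalg.eigvals(R))   # approximately [0.+1.j, 0.-1.j]: complex, not real
```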
Notice the familiar form of the sets of eigenvectors in the above examples.
- Definition 3.11
The eigenspace of a transformation $t$ associated with the eigenvalue $\lambda$ is $V_\lambda=\{\vec{\zeta}\mid t(\vec{\zeta})=\lambda\vec{\zeta}\}$. The eigenspace of a matrix is defined analogously.
- Lemma 3.12
An eigenspace is a subspace.
- Proof
An eigenspace must be nonempty (for one thing it contains the zero vector) and so we need only check closure. Take vectors $\vec{\zeta}_1,\dots,\vec{\zeta}_n$ from $V_\lambda$; to show that any linear combination is in $V_\lambda$,
$$t(c_1\vec{\zeta}_1+c_2\vec{\zeta}_2+\cdots+c_n\vec{\zeta}_n)=c_1t(\vec{\zeta}_1)+\cdots+c_nt(\vec{\zeta}_n)=c_1\lambda\vec{\zeta}_1+\cdots+c_n\lambda\vec{\zeta}_n=\lambda(c_1\vec{\zeta}_1+\cdots+c_n\vec{\zeta}_n)$$
(the second equality holds even if any $\vec{\zeta}_i$ is $\vec{0}$ since $t(\vec{0})=\lambda\cdot\vec{0}=\vec{0}$).
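Computationally, the eigenspace $V_\lambda$ is the null space of $T-\lambda I$, so a null-space routine produces a basis for it. A sketch with SymPy, using the standard-basis matrix of the projection from Example 3.2:

```python
import sympy as sp

T = sp.Matrix([[1, 0, 0],
               [0, 1, 0],
               [0, 0, 0]])
lam = 1

basis = (T - lam * sp.eye(3)).nullspace()
print(basis)   # [(1, 0, 0), (0, 1, 0)] as column matrices: V_1 is two-dimensional
```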
- Example 3.13
In Example 3.8 the eigenspace associated with the eigenvalue $\pi$ and the eigenspace associated with the eigenvalue $3$ are these.
$$V_\pi=\left\{\begin{pmatrix}a\\0\end{pmatrix}\;\Big|\;a\in\mathbb{C}\right\}\qquad V_3=\left\{\begin{pmatrix}-b/(\pi-3)\\b\end{pmatrix}\;\Big|\;b\in\mathbb{C}\right\}$$
- Remark 3.15
In Example 3.7 the characteristic equation is $0=-x(x-2)^2$, so in some sense $2$ is an eigenvalue "twice". However there are not "twice" as many eigenvectors, in that the dimension of the eigenspace associated with $2$ is one, not two. The next example shows a case where a number, $1$, is a double root of the characteristic equation and the dimension of the associated eigenspace is two.
- Example 3.16
With respect to the standard bases, this matrix
$$\begin{pmatrix}1&0&0\\0&1&0\\0&0&0\end{pmatrix}$$
represents projection.
$$\begin{pmatrix}x\\y\\z\end{pmatrix}\mapsto\begin{pmatrix}x\\y\\0\end{pmatrix}$$
Its eigenspace associated with the eigenvalue $0$ and its eigenspace associated with the eigenvalue $1$ are easy to find.
$$V_0=\left\{\begin{pmatrix}0\\0\\c_3\end{pmatrix}\;\Big|\;c_3\in\mathbb{C}\right\}\qquad V_1=\left\{\begin{pmatrix}c_1\\c_2\\0\end{pmatrix}\;\Big|\;c_1,c_2\in\mathbb{C}\right\}$$
By the lemma, if two eigenvectors $\vec{v}_1$ and $\vec{v}_2$ are associated with the same eigenvalue then any nonzero linear combination of those two is also an eigenvector associated with that same eigenvalue. But, if two eigenvectors $\vec{v}_1$ and $\vec{v}_2$ are associated with different eigenvalues then the sum $\vec{v}_1+\vec{v}_2$ need not be related to the eigenvalue of either one. In fact, just the opposite. If the eigenvalues are different then the eigenvectors are not linearly related.
- Theorem 3.17
For any set of distinct eigenvalues of a map or matrix, a set of associated eigenvectors, one per eigenvalue, is linearly independent.
- Proof
We will use induction on the number of eigenvalues. If there is no eigenvalue or only one eigenvalue then the set of associated eigenvectors is empty or is a singleton set with a non-$\vec{0}$ member, and in either case is linearly independent.
For induction, assume that the theorem is true for any set of $k$ distinct eigenvalues, suppose that $\lambda_1,\dots,\lambda_{k+1}$ are distinct eigenvalues, and let $\vec{v}_1,\dots,\vec{v}_{k+1}$ be associated eigenvectors. If $c_1\vec{v}_1+c_2\vec{v}_2+\cdots+c_{k+1}\vec{v}_{k+1}=\vec{0}$ then after multiplying both sides of the displayed equation by $\lambda_{k+1}$, applying the map or matrix to both sides of the displayed equation, and subtracting the first result from the second, we have this.
$$c_1(\lambda_1-\lambda_{k+1})\vec{v}_1+c_2(\lambda_2-\lambda_{k+1})\vec{v}_2+\cdots+c_k(\lambda_k-\lambda_{k+1})\vec{v}_k=\vec{0}$$
The induction hypothesis now applies: $c_1(\lambda_1-\lambda_{k+1})=0,\dots,c_k(\lambda_k-\lambda_{k+1})=0$. Thus, as all the eigenvalues are distinct, $c_1,\dots,c_k$ are all $0$. Finally, now $c_{k+1}$ must be $0$ because we are left with the equation $c_{k+1}\vec{v}_{k+1}=\vec{0}$ and $\vec{v}_{k+1}\neq\vec{0}$.
- Example 3.18
For a matrix whose eigenvalues are distinct, say $\lambda_1$, $\lambda_2$, and $\lambda_3$, a set of associated eigenvectors, one per eigenvalue, is linearly independent.
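A concrete check of the theorem (the matrix below is an illustrative one of ours, chosen triangular so that its eigenvalues are visibly distinct):

```python
import numpy as np

A = np.array([[3.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 1.0]])          # triangular: the eigenvalues 3, 2, 1 are distinct

values, vectors = np.linalg.eig(A)
print(values)                            # [3. 2. 1.]
print(np.linalg.matrix_rank(vectors))    # 3 -- the three eigenvectors are linearly independent
```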
- Corollary 3.19
An $n\times n$ matrix with $n$ distinct eigenvalues is diagonalizable.
- Proof
Form a basis of eigenvectors. Apply Corollary 2.4.
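A sketch of the diagonalization that the corollary promises, for an illustrative matrix with distinct eigenvalues: the columns of $P$ are eigenvectors, and $P^{-1}AP$ comes out diagonal.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])               # eigenvalues 5 and 2, which are distinct

values, P = np.linalg.eig(A)             # columns of P are associated eigenvectors
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))                   # diagonal, with the eigenvalues on the diagonal
print(values)
```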
Exercises
- Problem 1
For each, find the characteristic polynomial and the eigenvalues.
- This exercise is recommended for all readers.
- Problem 2
For each matrix, find the characteristic equation, and the eigenvalues and associated eigenvectors.
- Problem 3
Find the characteristic equation, and the eigenvalues and associated eigenvectors for this matrix. Hint. The eigenvalues are complex.
- Problem 4
Find the characteristic polynomial, the eigenvalues, and the associated eigenvectors of this matrix.
- This exercise is recommended for all readers.
- Problem 5
For each matrix, find the characteristic equation, and the eigenvalues and associated eigenvectors.
- This exercise is recommended for all readers.
- Problem 6
Let be
Find its eigenvalues and the associated eigenvectors.
- Problem 7
Find the eigenvalues and eigenvectors of this map .
- This exercise is recommended for all readers.
- Problem 8
Find the eigenvalues and associated eigenvectors of the differentiation operator $d/dx$.
- Problem 9
- Prove that the eigenvalues of a triangular matrix (upper or lower triangular) are the entries on the diagonal.
- This exercise is recommended for all readers.
- Problem 10
Find the formula for the characteristic polynomial of a matrix.
- Problem 11
Prove that the characteristic polynomial of a transformation is well-defined.
- This exercise is recommended for all readers.
- Problem 12
- Can any non-$\vec{0}$ vector in any nontrivial vector space be an eigenvector? That is, given a $\vec{v}\neq\vec{0}$ from a nontrivial space $V$, is there a transformation $t:V\to V$ and a scalar $\lambda$ such that $t(\vec{v})=\lambda\vec{v}$?
- Given a scalar $\lambda$, can any non-$\vec{0}$ vector in any nontrivial vector space be an eigenvector associated with the eigenvalue $\lambda$?
- This exercise is recommended for all readers.
- Problem 13
Suppose that $t:V\to V$ and $T=\mathrm{Rep}_{B,B}(t)$. Prove that the eigenvectors of $t$ associated with $\lambda$ are the non-$\vec{0}$ vectors in the kernel of the map represented (with respect to the same bases) by $T-\lambda I$.
- Problem 14
Prove that if $a,b,c,d$ are all integers and $a+b=c+d$ then
$$\begin{pmatrix}a&b\\c&d\end{pmatrix}$$
has integral eigenvalues, namely $a+b$ and $a-c$.
- This exercise is recommended for all readers.
- Problem 15
Prove that if $T$ is nonsingular and has eigenvalues $\lambda_1,\dots,\lambda_n$ then $T^{-1}$ has eigenvalues $1/\lambda_1,\dots,1/\lambda_n$. Is the converse true?
- This exercise is recommended for all readers.
- Problem 16
Suppose that $T$ is $n\times n$ and $c,d$ are scalars.
- Prove that if $T$ has the eigenvalue $\lambda$ with an associated eigenvector $\vec{v}$ then $\vec{v}$ is an eigenvector of $cT+dI$ associated with eigenvalue $c\lambda+d$.
- Prove that if $T$ is diagonalizable then so is $cT+dI$.
- This exercise is recommended for all readers.
- Problem 17
Show that $\lambda$ is an eigenvalue of $T$ if and only if the map represented by $T-\lambda I$ is not an isomorphism.
- Problem 18
- Show that if $\lambda$ is an eigenvalue of $T$ then $\lambda^n$ is an eigenvalue of $T^n$.
- What is wrong with this proof generalizing that? "If $\lambda$ is an eigenvalue of $T$ and $\mu$ is an eigenvalue for $S$, then $\lambda\mu$ is an eigenvalue for $TS$, for, if $T\vec{x}=\lambda\vec{x}$ and $S\vec{x}=\mu\vec{x}$ then $TS\vec{x}=T\mu\vec{x}=\mu T\vec{x}=\mu\lambda\vec{x}$"?
- Problem 19
Do matrix-equivalent matrices have the same eigenvalues?
- Problem 20
Show that a square matrix with real entries and an odd number of rows has at least one real eigenvalue.
- Problem 21
Diagonalize.
- Problem 22
Suppose that $P$ is a nonsingular $n\times n$ matrix. Show that the similarity transformation map $t_P:\mathcal{M}_{n\times n}\to\mathcal{M}_{n\times n}$ sending $T\mapsto PTP^{-1}$ is an isomorphism.
- ? Problem 23
Show that if $A$ is an $n\times n$ square matrix and each row (column) sums to $c$ then $c$ is a characteristic root of $A$. (Morrison 1967)
References
- Morrison, Clarence C. (proposer) (1967), "Quickie", Mathematics Magazine, 40 (4): 232.
- Strang, Gilbert (1980), Linear Algebra and its Applications (Second ed.), Harcourt Brace Jovanovich.