Famous Theorems of Mathematics/Algebra/Linear Transformations

Lemma for the eigenspace


All eigenvectors of the linear transformation A that correspond to the eigenvalue λ, together with the zero vector, form a subspace L(λ) of L.

Proof by Shilov (1969)


Indeed, if Ax1 = λx1 and Ax2 = λx2, then for any scalars α and β

A(αx1 + βx2) = αAx1 + βAx2 = αλx1 + βλx2 = λ(αx1 + βx2),

so αx1 + βx2 again lies in L(λ), which proves the lemma.
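The closure property above can be checked numerically. The following sketch (the matrix, eigenvectors, and scalars are illustrative choices, not from the source) takes a transformation with a two-dimensional eigenspace for λ = 2 and verifies that an arbitrary linear combination of two eigenvectors is again an eigenvector for the same eigenvalue:

```python
import numpy as np

# A diagonal transformation with eigenvalue 2 of multiplicity 2
# and eigenvalue 3 of multiplicity 1 (illustrative example).
A = np.diag([2.0, 2.0, 3.0])

# Two eigenvectors corresponding to lambda = 2.
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])

lam = 2.0
alpha, beta = 1.5, -0.7  # arbitrary scalars

# The lemma: alpha*x1 + beta*x2 is again an eigenvector for lambda
# (or the zero vector), since A(alpha*x1 + beta*x2) = lambda*(alpha*x1 + beta*x2).
y = alpha * x1 + beta * x2
print(np.allclose(A @ y, lam * y))  # True
```

Any other choice of α and β gives the same result, which is exactly the subspace property the lemma asserts.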

Lemma for linear independence of eigenvectors


Eigenvectors x1, x2, ... , xn of the (linear) transformation A with respective pairwise distinct eigenvalues λ1, λ2, ... , λn, are linearly independent.

Proof by Shilov (1969)


This statement is proved by induction on n. For n = 1 the lemma is obviously true. Suppose the lemma is true for any n – 1 eigenvectors of the transformation A; it remains to show that it holds for n eigenvectors. Suppose some linear combination of the n eigenvectors equals 0:

α1x1 + α2x2 + ... + αnxn = 0.

Applying transformation A to this identity, one has

α1λ1x1 + α2λ2x2 + ... + αnλnxn = 0.

Multiply the first equation by λn and subtract from the second one; one obtains

α1(λ1 – λn)x1 + α2(λ2 – λn)x2 + ... + αn – 1(λn – 1 – λn)xn – 1 = 0,

where by the induction hypothesis the vectors x1, x2, ..., xn – 1 are linearly independent, so every coefficient αi(λi – λn) must be zero. Since the eigenvalues are pairwise distinct, λi – λn ≠ 0, and hence αi = 0 for each i < n; the first equation reduces to

αnxn = 0

which means αn = 0 as well, since the eigenvector xn is nonzero. Consequently, all coefficients αi are 0, and therefore the vectors x1, x2, ..., xn are linearly independent.
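The conclusion of this lemma can also be observed numerically. The sketch below (the particular matrix is an illustrative choice, not from the source) takes a transformation with pairwise distinct eigenvalues and checks that the matrix whose columns are its eigenvectors has full rank, i.e. the eigenvectors are linearly independent:

```python
import numpy as np

# An upper-triangular transformation with pairwise distinct
# eigenvalues 1, 2, 3 on the diagonal (illustrative example).
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])

# Columns of eigvecs are eigenvectors of A.
eigvals, eigvecs = np.linalg.eig(A)

# Distinct eigenvalues imply the eigenvectors are linearly independent,
# so the eigenvector matrix has full rank.
rank = np.linalg.matrix_rank(eigvecs)
print(sorted(eigvals.real))   # [1.0, 2.0, 3.0]
print(rank == A.shape[0])     # True
```

If A instead had a repeated eigenvalue with a defective eigenspace, numpy would return repeated eigenvector columns and the rank check would fail, matching the lemma's restriction to pairwise distinct eigenvalues.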