This subsection is optional. It consists of proofs of two results from the prior subsection. These proofs involve the properties of permutations, which will not be used later, except in the optional Jordan Canonical Form subsection.
The prior subsection attacks the problem of showing that for any size there is a determinant function on the set of square matrices of that size by using multilinearity to develop the permutation expansion.

$$|T| = \sum_{\text{permutations }\phi} t_{1,\phi(1)}\, t_{2,\phi(2)} \cdots t_{n,\phi(n)}\, |P_\phi|$$

This reduces the problem to showing that there is a determinant function on the set of permutation matrices of that size.
Of course, a permutation matrix can be row-swapped to the identity matrix
and to calculate its determinant we can keep track of the number of row swaps.
However, the problem is not yet solved: we have not shown that the result is well-defined.
For instance, the determinant of this permutation matrix

$$P = \begin{pmatrix} 1&0&0 \\ 0&0&1 \\ 0&1&0 \end{pmatrix}$$

could be computed with one swap, $\rho_2 \leftrightarrow \rho_3$, or with three, $\rho_1 \leftrightarrow \rho_2$ then $\rho_2 \leftrightarrow \rho_3$ then $\rho_1 \leftrightarrow \rho_3$. Both sequences have an odd number of swaps, so we figure that $|P| = -1$, but how do we know that there isn't some way to do it with an even number of swaps?
Corollary 4.6 below proves
that there is no permutation matrix
that can be row-swapped to an identity matrix in two ways, one with
an even number of swaps and the other with an odd number of swaps.
- Definition 4.1
Two rows of a permutation matrix

$$\begin{pmatrix} \vdots \\ \iota_k \\ \vdots \\ \iota_j \\ \vdots \end{pmatrix}$$

such that $k > j$ are in an inversion of their natural order.
- Example 4.2
This permutation matrix

$$P_\phi = \begin{pmatrix} \iota_3 \\ \iota_2 \\ \iota_1 \end{pmatrix} = \begin{pmatrix} 0&0&1 \\ 0&1&0 \\ 1&0&0 \end{pmatrix}$$

has three inversions: $\iota_3$ precedes $\iota_1$, $\iota_3$ precedes $\iota_2$, and $\iota_2$ precedes $\iota_1$.
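As an illustration (ours, not the text's; the helper name `inversions` is an assumption), the inversion count of Definition 4.1 can be computed directly from the list of row subscripts of a permutation matrix:

```python
def inversions(perm):
    """Count pairs that are out of natural order: positions i < j
    whose row subscripts satisfy perm[i] > perm[j]."""
    n = len(perm)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if perm[i] > perm[j])

# A permutation matrix with rows iota_3, iota_2, iota_1 has the
# subscript sequence <3, 2, 1>, which has three inversions.
print(inversions([3, 2, 1]))  # 3
```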
- Lemma 4.3
A row-swap in a permutation matrix changes the number of inversions from even to odd, or from odd to even.
Consider a swap of rows $j$ and $k$, where $j < k$. If the two rows are adjacent, $k = j + 1$,

$$\begin{pmatrix} \vdots \\ \iota_{\phi(j)} \\ \iota_{\phi(k)} \\ \vdots \end{pmatrix} \;\xrightarrow{\rho_j \leftrightarrow \rho_k}\; \begin{pmatrix} \vdots \\ \iota_{\phi(k)} \\ \iota_{\phi(j)} \\ \vdots \end{pmatrix}$$

then the swap changes the total number of inversions by one, either removing one inversion or producing one, depending on whether $\phi(j) > \phi(k)$ or not, since inversions involving rows not in this pair are not affected. Consequently, the total number of inversions changes from odd to even or from even to odd.

If the rows are not adjacent then they can be swapped via a sequence of adjacent swaps, first bringing row $k$ up

$$\rho_{k-1} \leftrightarrow \rho_k,\quad \rho_{k-2} \leftrightarrow \rho_{k-1},\quad \ldots,\quad \rho_j \leftrightarrow \rho_{j+1}$$

and then bringing row $j$ down.

$$\rho_{j+1} \leftrightarrow \rho_{j+2},\quad \rho_{j+2} \leftrightarrow \rho_{j+3},\quad \ldots,\quad \rho_{k-1} \leftrightarrow \rho_k$$

Each of these adjacent swaps changes the number of inversions from odd to even or from even to odd. There are an odd number, $(k-j) + (k-j-1)$, of them. Therefore the total change in the number of inversions is from even to odd or from odd to even.
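The lemma is easy to check by brute force. This sketch (our own illustration; the names are assumptions) verifies, for every 4-permutation and every pair of rows, that a swap flips the parity of the inversion count:

```python
from itertools import permutations

def inversions(perm):
    """Count positions i < j with perm[i] > perm[j]."""
    n = len(perm)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if perm[i] > perm[j])

# Every row swap in every 4x4 permutation matrix changes inversion parity.
for perm in permutations(range(1, 5)):
    for i in range(4):
        for j in range(i + 1, 4):
            swapped = list(perm)
            swapped[i], swapped[j] = swapped[j], swapped[i]
            assert inversions(perm) % 2 != inversions(swapped) % 2
print("every swap flips the inversion parity")
```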
- Definition 4.4
The signum of a permutation $\phi$ is $+1$ if the number of inversions in $P_\phi$ is even, and is $-1$ if the number of inversions is odd.
- Example 4.5
With the subscripts from Example 3.8 for the $3$-permutations, $\operatorname{sgn}(\phi_1) = +1$ while $\operatorname{sgn}(\phi_2) = -1$.
- Corollary 4.6
If a permutation matrix has an odd number of inversions then swapping it to the identity takes an odd number of swaps. If it has an even number of inversions then swapping to the identity takes an even number of swaps.
The identity matrix has zero inversions. To change an odd number to zero requires an odd number of swaps, and to change an even number to zero requires an even number of swaps.
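The corollary can likewise be spot-checked computationally. In this sketch (our own; the greedy swap strategy is just one way to reach the identity, and the function names are assumptions), we row-swap each permutation matrix to the identity and compare the swap count's parity with the inversion count's parity:

```python
from itertools import permutations

def inversions(perm):
    """Count positions i < j with perm[i] > perm[j]."""
    n = len(perm)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if perm[i] > perm[j])

def swaps_to_identity(perm):
    """Greedily swap row iota_k into position k; return the swap count."""
    rows = list(perm)
    count = 0
    for k in range(len(rows)):
        if rows[k] != k + 1:
            idx = rows.index(k + 1)
            rows[k], rows[idx] = rows[idx], rows[k]
            count += 1
    return count

for perm in permutations(range(1, 5)):
    assert swaps_to_identity(perm) % 2 == inversions(perm) % 2
print("swap parity always matches inversion parity")
```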
We still have not shown that the permutation expansion is
well-defined because we have not considered row operations
on permutation matrices other than row swaps.
We will finesse this problem: we will define a function $d\colon \mathcal{M}_{n\times n} \to \mathbb{R}$ by altering the permutation expansion formula, replacing $|P_\phi|$ with $\operatorname{sgn}(\phi)$.

$$d(T) = \sum_{\text{permutations }\phi} t_{1,\phi(1)}\, t_{2,\phi(2)} \cdots t_{n,\phi(n)} \cdot \operatorname{sgn}(\phi)$$

(This gives the same value as the permutation expansion because the prior result shows that $|P_\phi| = \operatorname{sgn}(\phi)$.) This formula's advantage is that the number of inversions is clearly well-defined: just count them. Therefore, we will show that a determinant function exists for all sizes by showing that $d$ is it, that is, that $d$ satisfies the four conditions in the definition.
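The function $d$ is straightforward to compute for small matrices. Here is a sketch (our own illustration; `sgn` and `d` are assumed names, with rows and columns indexed from zero):

```python
from itertools import permutations
from math import prod

def sgn(phi):
    """+1 if phi has an even number of inversions, -1 if odd."""
    n = len(phi)
    inv = sum(1 for i in range(n) for j in range(i + 1, n) if phi[i] > phi[j])
    return -1 if inv % 2 else 1

def d(T):
    """Sum over permutations phi of t_{1,phi(1)} ... t_{n,phi(n)} sgn(phi)."""
    n = len(T)
    return sum(sgn(phi) * prod(T[i][phi[i]] for i in range(n))
               for phi in permutations(range(n)))

T = [[1, 2, 1],
     [0, 3, 4],
     [2, 1, 1]]
print(d(T))  # 9
```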
- Lemma 4.7
The function $d$ is a determinant. Hence determinants exist for every $n$.
We must check that it has the four properties from the definition.

Property (4) is easy; in

$$d(I) = \sum_{\text{perms }\phi} \iota_{1,\phi(1)}\, \iota_{2,\phi(2)} \cdots \iota_{n,\phi(n)}\, \operatorname{sgn}(\phi)$$

all of the summands are zero except for the product down the diagonal, which is one.
For property (3) consider $d(\hat T)$ where $T \xrightarrow{k\rho_i} \hat T$.

$$\sum_{\text{perms }\phi} \hat t_{1,\phi(1)} \cdots \hat t_{i,\phi(i)} \cdots \hat t_{n,\phi(n)}\, \operatorname{sgn}(\phi) = \sum_{\phi} t_{1,\phi(1)} \cdots k\,t_{i,\phi(i)} \cdots t_{n,\phi(n)}\, \operatorname{sgn}(\phi)$$

Factor the $k$ out of each term to get the desired equality.

$$= k \cdot \sum_{\phi} t_{1,\phi(1)} \cdots t_{i,\phi(i)} \cdots t_{n,\phi(n)}\, \operatorname{sgn}(\phi) = k \cdot d(T)$$
For (2), let $T \xrightarrow{\rho_i \leftrightarrow \rho_j} \hat T$.

$$d(\hat T) = \sum_{\text{perms }\phi} \hat t_{1,\phi(1)} \cdots \hat t_{i,\phi(i)} \cdots \hat t_{j,\phi(j)} \cdots \hat t_{n,\phi(n)}\, \operatorname{sgn}(\phi)$$

To convert to unhatted $t$'s, for each $\phi$ consider the permutation $\sigma$ that equals $\phi$ except that the $i$-th and $j$-th numbers are interchanged, $\sigma(i) = \phi(j)$ and $\sigma(j) = \phi(i)$. Replacing the $\phi$ in $\hat t_{1,\phi(1)} \cdots \hat t_{i,\phi(i)} \cdots \hat t_{j,\phi(j)} \cdots \hat t_{n,\phi(n)}$ with this $\sigma$ gives $t_{1,\sigma(1)} \cdots t_{i,\sigma(i)} \cdots t_{j,\sigma(j)} \cdots t_{n,\sigma(n)}$, since row $i$ of $\hat T$ is row $j$ of $T$ and vice versa. Because $P_\sigma$ comes from $P_\phi$ by one row swap, $\operatorname{sgn}(\sigma) = -\operatorname{sgn}(\phi)$ (by Lemma 4.3) and so we get

$$d(\hat T) = \sum_{\text{perms }\sigma} t_{1,\sigma(1)} \cdots t_{i,\sigma(i)} \cdots t_{j,\sigma(j)} \cdots t_{n,\sigma(n)} \cdot \bigl(-\operatorname{sgn}(\sigma)\bigr)$$

where the sum is over all permutations $\sigma$ derived from another permutation $\phi$ by a swap of the $i$-th and $j$-th numbers. But any permutation can be derived from some other permutation by such a swap, in one and only one way, so this summation is in fact a sum over all permutations, taken once and only once. Thus $d(\hat T) = -d(T)$.
To do property (1) let $T \xrightarrow{k\rho_i + \rho_j} \hat T$ and consider

$$d(\hat T) = \sum_{\text{perms }\phi} \hat t_{1,\phi(1)} \cdots \hat t_{j,\phi(j)} \cdots \hat t_{n,\phi(n)}\, \operatorname{sgn}(\phi) = \sum_{\phi} t_{1,\phi(1)} \cdots \bigl(k\,t_{i,\phi(j)} + t_{j,\phi(j)}\bigr) \cdots t_{n,\phi(n)}\, \operatorname{sgn}(\phi)$$

(the combination replaces row $j$ with $k$ times row $i$ plus row $j$, so the factor is $k\,t_{i,\phi(j)}$, not $k\,t_{i,\phi(i)}$). Distribute, commute, and factor.

$$
\begin{aligned}
d(\hat T) &= \sum_{\phi} t_{1,\phi(1)} \cdots t_{j,\phi(j)} \cdots t_{n,\phi(n)}\, \operatorname{sgn}(\phi) + \sum_{\phi} t_{1,\phi(1)} \cdots k\,t_{i,\phi(j)} \cdots t_{n,\phi(n)}\, \operatorname{sgn}(\phi) \\
&= d(T) + k \cdot \sum_{\phi} t_{1,\phi(1)} \cdots t_{i,\phi(j)} \cdots t_{n,\phi(n)}\, \operatorname{sgn}(\phi)
\end{aligned}
$$

We finish by showing that the terms $t_{1,\phi(1)} \cdots t_{i,\phi(j)} \cdots t_{n,\phi(n)}\, \operatorname{sgn}(\phi)$ add to zero. This sum represents $d(S)$ where $S$ is a matrix equal to $T$ except that row $j$ of $S$ is a copy of row $i$ of $T$ (because the factor is $t_{i,\phi(j)}$, not $t_{j,\phi(j)}$). Thus, $S$ has two equal rows, rows $i$ and $j$. Since we have already shown that $d$ changes sign on row swaps, as in Lemma 2.3 we conclude that $d(S) = 0$.
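The four properties can also be spot-checked numerically. This sketch (our own illustration, with assumed names) implements $d$ by the permutation expansion with $\operatorname{sgn}$ and tests each property on a small integer matrix:

```python
from itertools import permutations
from math import prod

def sgn(phi):
    """+1 if phi has an even number of inversions, -1 if odd."""
    n = len(phi)
    inv = sum(1 for i in range(n) for j in range(i + 1, n) if phi[i] > phi[j])
    return -1 if inv % 2 else 1

def d(T):
    """Permutation expansion with sgn in place of the |P_phi| factor."""
    n = len(T)
    return sum(sgn(phi) * prod(T[i][phi[i]] for i in range(n))
               for phi in permutations(range(n)))

T = [[1, -2, 0],
     [3, 1, 2],
     [0, 4, -1]]
k = 5

# (4) d of the identity is one.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
assert d(I) == 1

# (3) multiplying a row by k multiplies d by k.
S = [row[:] for row in T]
S[1] = [k * x for x in S[1]]
assert d(S) == k * d(T)

# (2) swapping two rows changes the sign.
S = [row[:] for row in T]
S[0], S[2] = S[2], S[0]
assert d(S) == -d(T)

# (1) adding k times one row to another leaves d unchanged.
S = [row[:] for row in T]
S[2] = [a + k * b for a, b in zip(S[2], S[0])]
assert d(S) == d(T)

print("all four properties hold on this sample")
```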
We have now shown that determinant functions exist for each size.
We already know that for each size there is at most one determinant function.
Therefore, the permutation expansion
computes the one and only determinant value of a square matrix.
We end this subsection by proving the other result
remaining from the prior subsection, that the determinant of a matrix
equals the determinant of its transpose.
- Example 4.8
Writing out the permutation expansion of the general
matrix and of its transpose, and
comparing corresponding terms
(terms with the same letters)
shows that the corresponding permutation matrices are transposes. That is, there is a relationship between these corresponding permutations. Problem 6 shows that they are inverses.
- Theorem 4.9
The determinant of a matrix equals the determinant of its transpose.
Call the matrix $T$ and denote the entries of $T^{\text{trans}}$ with $s$'s, so that $t_{i,j} = s_{j,i}$. Substitution gives this

$$|T| = \sum_{\text{perms }\phi} t_{1,\phi(1)} \cdots t_{n,\phi(n)}\, \operatorname{sgn}(\phi) = \sum_{\phi} s_{\phi(1),1} \cdots s_{\phi(n),n}\, \operatorname{sgn}(\phi)$$

and we can finish the argument by manipulating the expression on the right to be recognizable as the determinant of the transpose. We have written all permutation expansions (as in the middle expression above) with the row indices ascending. To rewrite the expression on the right in this way, note that because $\phi$ is a permutation, the row indices in the term on the right $\phi(1)$, ..., $\phi(n)$ are just the numbers $1$, ..., $n$, rearranged. We can thus commute the factors to have these ascend, giving $s_{1,\phi^{-1}(1)} \cdots s_{n,\phi^{-1}(n)}$ (if the column index is $j$ and the row index is $\phi(j)$ then, where the row index is $i$, the column index is $\phi^{-1}(i)$). Substituting on the right gives this.

$$= \sum_{\text{perms }\phi^{-1}} s_{1,\phi^{-1}(1)} \cdots s_{n,\phi^{-1}(n)}\, \operatorname{sgn}(\phi^{-1})$$

(Here we use that $\operatorname{sgn}(\phi^{-1}) = \operatorname{sgn}(\phi)$: a pair of numbers is out of natural order under $\phi$ if and only if the corresponding pair is out of order under $\phi^{-1}$, so the two permutations have the same number of inversions.) Since every permutation is the inverse of another, a sum over all $\phi^{-1}$ is a sum over all permutations, and so this expression is the permutation expansion of $|T^{\text{trans}}|$.
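Both facts used at the end of the proof can be confirmed computationally. This sketch (our own, with assumed names) checks exhaustively that every 4-permutation has the same signum as its inverse, and that the permutation-expansion function gives the same value on a sample matrix and its transpose:

```python
from itertools import permutations
from math import prod

def sgn(phi):
    """+1 if phi has an even number of inversions, -1 if odd."""
    n = len(phi)
    inv = sum(1 for i in range(n) for j in range(i + 1, n) if phi[i] > phi[j])
    return -1 if inv % 2 else 1

def d(T):
    """Permutation expansion with sgn in place of the |P_phi| factor."""
    n = len(T)
    return sum(sgn(phi) * prod(T[i][phi[i]] for i in range(n))
               for phi in permutations(range(n)))

# sgn(phi) = sgn(phi inverse), exhaustively for the 4-permutations.
for phi in permutations(range(4)):
    inverse = [0] * 4
    for i, v in enumerate(phi):
        inverse[v] = i
    assert sgn(phi) == sgn(inverse)

# d(T) = d(T transpose) on a sample matrix.
T = [[1, 2, 1], [0, 3, 4], [2, 1, 1]]
Tt = [list(col) for col in zip(*T)]
assert d(T) == d(Tt)
print("signum and transpose checks pass")
```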