Linear Algebra/Determinants

In the first chapter of this book we considered linear systems and we picked out the special case of systems with the same number of equations as unknowns, those of the form $T\vec{x}=\vec{b}$ where $T$ is a square matrix. We noted a distinction between two classes of $T$'s. While such systems may have a unique solution or no solutions or infinitely many solutions, if a particular $T$ is associated with a unique solution in any system, such as the homogeneous system $T\vec{x}=\vec{0}$, then $T$ is associated with a unique solution for every $\vec{b}$. We call such a matrix of coefficients "nonsingular". The other kind of $T$, where every linear system for which it is the matrix of coefficients has either no solution or infinitely many solutions, we call "singular".
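For a concrete contrast (the particular matrices below are chosen here for illustration, not drawn from the earlier chapters), consider

$$T = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \qquad\text{and}\qquad S = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}.$$

The matrix $T$ is nonsingular: every system $T\vec{x}=\vec{b}$ has exactly one solution. The matrix $S$ is singular: its second row is twice its first, so $S\vec{x}=\vec{b}$ has no solution unless $b_2 = 2b_1$, and when a solution does exist there are infinitely many.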

Through the second and third chapters the value of this distinction has been a theme. For instance, we now know that nonsingularity of an $n \times n$ matrix $T$ is equivalent to each of these (conditions (2) and (6) are illustrated just after the list):

  1. a system $T\vec{x}=\vec{b}$ has a solution, and that solution is unique;
  2. Gauss-Jordan reduction of $T$ yields an identity matrix;
  3. the rows of $T$ form a linearly independent set;
  4. the columns of $T$ form a basis for $\mathbb{R}^n$;
  5. any map that $T$ represents is an isomorphism;
  6. an inverse matrix $T^{-1}$ exists.
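As an illustration of condition (2), Gauss-Jordan reduction of the nonsingular matrix $T$ above ends at the identity:

$$\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}
\xrightarrow{-3\rho_1+\rho_2}
\begin{pmatrix} 1 & 2 \\ 0 & -2 \end{pmatrix}
\xrightarrow{-(1/2)\rho_2}
\begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}
\xrightarrow{-2\rho_2+\rho_1}
\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$

and, as condition (6) asserts, the inverse exists: $T^{-1} = \begin{pmatrix} -2 & 1 \\ 3/2 & -1/2 \end{pmatrix}$, which is routine to verify by computing $TT^{-1}$.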

So when we look at a particular square matrix, the question of whether it is nonsingular is one of the first things that we ask. This chapter develops a formula to determine this. (Since we will restrict the discussion to square matrices, in this chapter we will usually simply say "matrix" in place of "square matrix".)

More precisely, we will develop infinitely many formulas, one for $1 \times 1$ matrices, one for $2 \times 2$ matrices, etc. Of course, these formulas are related; that is, we will develop a family of formulas, a scheme that describes the formula for each size.
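As a preview of the family's first two members (these formulas are not derived until later in the chapter, so regard them as forward references rather than established results):

$$\det\begin{pmatrix} a \end{pmatrix} = a \qquad\qquad \det\begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc$$

In each case the matrix is nonsingular exactly when its determinant is nonzero.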
