5.23. Summary

5.3.1. Definition:  Determinant Function

The determinant function of order n is defined as

        $d: M_{n,n} \to \mathbb{R}$ (or $\mathbb{C}$)   by   $A \mapsto \det A = d(A_1, \ldots, A_n)$,  where $A_1, \ldots, A_n$ are the rows of $A$,

such that the following axioms are satisfied:

Axiom 1.       Homogeneity In Each Row:

$d(\ldots, tA_i, \ldots) = t\, d(\ldots, A_i, \ldots)$            where $t \in \mathbb{R}$ (or $\mathbb{C}$).

Axiom 2.       Invariance Under Row Addition:

$d(\ldots, A_i + A_j, \ldots, A_j, \ldots) = d(\ldots, A_i, \ldots, A_j, \ldots)$

Axiom 3.       Scale:

$\det I = d(I_1, \ldots, I_n) = 1$
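
The axioms can be spot-checked numerically. Below is a minimal sketch (not part of the text) using Python with NumPy, taking numpy.linalg.det as the function $d$ and an arbitrary $3 \times 3$ matrix:

    import numpy as np

    # Sample matrix; its rows play the role of A_1, A_2, A_3.
    A = np.array([[2.0, 1.0, 0.0],
                  [4.0, 3.0, 1.0],
                  [1.0, 5.0, 2.0]])
    t = 7.0

    # Axiom 1: multiplying one row by t multiplies the determinant by t.
    B = A.copy()
    B[1] *= t
    assert np.isclose(np.linalg.det(B), t * np.linalg.det(A))

    # Axiom 2: adding one row to another leaves the determinant unchanged.
    C = A.copy()
    C[0] += C[2]
    assert np.isclose(np.linalg.det(C), np.linalg.det(A))

    # Axiom 3: det I = 1.
    assert np.isclose(np.linalg.det(np.eye(3)), 1.0)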

5.3.2. Theorem 5.1:  Zero Vector

If some row of $A$ is the zero vector, then $\det A = 0$.

5.3.3. Theorem 5.2.

$d(\ldots, A_i + tA_j, \ldots, A_j, \ldots) = d(\ldots, A_i, \ldots, A_j, \ldots)$

where $i \neq j$ and $t$ is a scalar.

5.3.4. Theorem 5.3:  Antisymmetry

(a)      $d(\ldots, A_i, \ldots, A_j, \ldots) = -\,d(\ldots, A_j, \ldots, A_i, \ldots)$  for all $i \neq j$.

(b)      $d(\ldots, A_i, \ldots, A_i, \ldots) = 0$

5.4.  Theorem 5.4:  Diagonal Matrix

If $A = \operatorname{diag}(a_{11}, a_{22}, \ldots, a_{nn})$, then

        $\det A = a_{11} a_{22} \cdots a_{nn} = \prod_{i=1}^{n} a_{ii}$                             (5.4)

5.5.  Theorem 5.5:  Upper Triangular Matrix

For an $n \times n$ upper triangular matrix $U$,

        $\det U = u_{11} u_{22} \cdots u_{nn} = \prod_{i=1}^{n} u_{ii}$                             (5.7)
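
As a quick numerical illustration (arbitrary sample $U$):

    import numpy as np

    # det of an upper triangular matrix = product of its diagonal entries.
    U = np.array([[3.0, 1.0, 4.0],
                  [0.0, 2.0, 5.0],
                  [0.0, 0.0, 6.0]])
    assert np.isclose(np.linalg.det(U), np.prod(np.diag(U)))   # 3 * 2 * 6 = 36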

5.6.  Reduction to Upper Triangular Form

If $A$ is reduced to an upper triangular matrix $U$ by $p$ row interchanges, row additions, and multiplication of rows by nonzero scalars $c_1, \ldots, c_q$, then

        $\det U = (-1)^p\, c_1 \cdots c_q\, \det A$

Consequently,

        $\det A = c(A)\, \det I$                             (5.8)

where the scalar $c(A)$ is determined solely by the row operations used in the reduction.
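
A sketch of this reduction in code (the helper name det_by_elimination is illustrative, not from the text); it uses only row interchanges and row additions, so every scale factor $c_i$ equals 1 and $\det A = (-1)^p \det U$:

    import numpy as np

    def det_by_elimination(A):
        # Reduce A to upper triangular U using only row interchanges and
        # row additions, so det A = (-1)**p * prod(diag(U)) with p swaps.
        U = np.array(A, dtype=float)
        n = U.shape[0]
        sign = 1.0
        for k in range(n):
            # Choose a pivot row; a zero pivot column means det A = 0.
            pivot = int(np.argmax(np.abs(U[k:, k]))) + k
            if np.isclose(U[pivot, k], 0.0):
                return 0.0
            if pivot != k:
                U[[k, pivot]] = U[[pivot, k]]
                sign = -sign              # each interchange flips the sign
            # Row additions (Theorem 5.2) leave the determinant unchanged.
            U[k+1:] -= np.outer(U[k+1:, k] / U[k, k], U[k])
        return sign * np.prod(np.diag(U))

    A = np.array([[0.0, 2.0, 1.0],
                  [3.0, 1.0, 4.0],
                  [1.0, 5.0, 2.0]])
    assert np.isclose(det_by_elimination(A), np.linalg.det(A))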

5.7.  Theorem 5.6.  Uniqueness Theorem For Determinants

If $d$ satisfies all three axioms and $f$ satisfies Axioms 1 and 2, then for every $n \times n$ matrix $A$,

        $f(A) = d(A)\, f(I)$                          (5.9)

5.9.  Theorem 5.7.  Additivity In Each Row.

        $d(\ldots, A_k + V, \ldots) = d(\ldots, A_k, \ldots) + d(\ldots, V, \ldots)$                (5.10a)

5.10.1. Theorem 5.8:  Dependence

If the rows of a matrix $A$ are dependent, then $d(A) = 0$.

5.10.2. Theorem 5.9:  Expansion Formula

        $\begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix} = a_{11} a_{22} - a_{12} a_{21}$                               (5.12a)

        $\begin{vmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix} = a_{11}\begin{vmatrix} a_{22} & a_{23} \\ a_{32} & a_{33} \end{vmatrix} - a_{12}\begin{vmatrix} a_{21} & a_{23} \\ a_{31} & a_{33} \end{vmatrix} + a_{13}\begin{vmatrix} a_{21} & a_{22} \\ a_{31} & a_{32} \end{vmatrix}$              (5.12)
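
A quick numerical check of (5.12) (arbitrary sample matrix; det2 is a throwaway helper for the $2 \times 2$ formula (5.12a)):

    import numpy as np

    A = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0],
                  [7.0, 8.0, 10.0]])

    def det2(m):
        # 2x2 formula (5.12a): a11*a22 - a12*a21
        return m[0, 0] * m[1, 1] - m[0, 1] * m[1, 0]

    expansion = (  A[0, 0] * det2(A[1:, [1, 2]])
                 - A[0, 1] * det2(A[1:, [0, 2]])
                 + A[0, 2] * det2(A[1:, [0, 1]]))
    assert np.isclose(expansion, np.linalg.det(A))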

5.11.  Lemma 5.10.

        $(AB)_i = A_i B$                             (5.14)

5.11.  Theorem 5.11.  Product Formula for Determinants.

        $\det(AB) = (\det A)(\det B)$                        (5.15)
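
A numerical spot-check of (5.15) on two randomly generated $4 \times 4$ matrices:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    B = rng.standard_normal((4, 4))
    assert np.isclose(np.linalg.det(A @ B),
                      np.linalg.det(A) * np.linalg.det(B))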

5.12.  Theorem 5.12:  Inverse

If $A$ is nonsingular, then

        $\det(A^{-1}) = \dfrac{1}{\det A}$                                      (5.16)

5.13.  Theorem 5.13:  Independence

A set of $n$ vectors $\{A_1, \ldots, A_n\}$ in $n$-space is independent iff $d(A_1, \ldots, A_n) \neq 0$.
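
For example, the test can be applied by stacking the vectors as the rows of a matrix; in the sketch below the third vector is deliberately $2A_1 + A_2$, so the determinant vanishes:

    import numpy as np

    v1 = np.array([1.0, 2.0, 0.0])
    v2 = np.array([0.0, 1.0, 1.0])
    v3 = np.array([2.0, 5.0, 1.0])            # v3 = 2*v1 + v2: dependent set
    M = np.vstack([v1, v2, v3])
    assert np.isclose(np.linalg.det(M), 0.0)  # zero determinant => dependent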

5.14.  Theorem 5.14:  Block Matrix

        $\det \begin{pmatrix} A & O \\ O & B \end{pmatrix} = (\det A)(\det B)$                           (5.17)
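
A sketch checking (5.17) on a block diagonal matrix assembled with numpy.block:

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    B = np.array([[5.0, 6.0],
                  [7.0, 9.0]])
    Z = np.zeros((2, 2))
    M = np.block([[A, Z],
                  [Z, B]])
    assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(B))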

5.15.  Expansion by the ith Row Cofactors

        $\det A = \sum_{j=1}^{n} a_{ij} \operatorname{cof} a_{ij}$                          (5.24)

        $\operatorname{cof} a_{ij} = d(A_1, \ldots, A_{i-1}, I_j, A_{i+1}, \ldots, A_n)$                                      (5.22)

where the $j$th unit coordinate vector $I_j$ replaces the $i$th row of $A$.

5.17.  Definition:  Cofactor Matrix.

        $\operatorname{cof} A = \bigl( \operatorname{cof} a_{ij} \bigr)_{i,j=1}^{n}$

5.17.  Theorem 5.16:  Expansion by Cofactors

        $A\,(\operatorname{cof} A)^t = (\det A)\, I$                               (5.25)

If $\det A \neq 0$, then

        $A^{-1} = \dfrac{1}{\det A}\,(\operatorname{cof} A)^t$                          (5.26)
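
The cofactor matrix, and with it (5.25) and (5.26), can be computed directly; the helper cofactor_matrix below is an illustrative sketch (it builds each cofactor from a signed minor, cf. (5.33)), not a library routine:

    import numpy as np

    def cofactor_matrix(A):
        # cof A: entry (i, j) is (-1)**(i+j) times the determinant of the
        # minor obtained by deleting row i and column j of A.
        n = A.shape[0]
        C = np.empty_like(A, dtype=float)
        for i in range(n):
            for j in range(n):
                minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
                C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
        return C

    A = np.array([[2.0, 0.0, 1.0],
                  [1.0, 3.0, 2.0],
                  [1.0, 1.0, 2.0]])
    C = cofactor_matrix(A)
    assert np.allclose(A @ C.T, np.linalg.det(A) * np.eye(3))       # (5.25)
    assert np.allclose(np.linalg.inv(A), C.T / np.linalg.det(A))    # (5.26)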

5.17.  Theorem 5.17:  Nonsingularity

A square matrix $A$ is nonsingular iff $\det A \neq 0$.

5.18.  Theorem 5.18:  Cramer’s Rule.

If $A = (a_{ij})$ is nonsingular, the system

        $\sum_{j=1}^{n} a_{ij} x_j = b_i$                               ($i = 1, \ldots, n$)

has a unique solution given by

        $x_j = \dfrac{1}{\det A} \sum_{k=1}^{n} b_k \operatorname{cof} a_{kj}$              for $j = 1, \ldots, n$                (5.31)
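
A sketch of Cramer's rule in code (the helper name cramer_solve is illustrative). It uses the equivalent column-replacement form: the sum $\sum_{k=1}^{n} b_k \operatorname{cof} a_{kj}$ in (5.31) is exactly the determinant of $A$ with its $j$th column replaced by $b$.

    import numpy as np

    def cramer_solve(A, b):
        # x_j = det(A_j) / det(A), where A_j is A with column j replaced by b.
        n = A.shape[0]
        dA = np.linalg.det(A)
        x = np.empty(n)
        for j in range(n):
            Aj = A.copy()
            Aj[:, j] = b
            x[j] = np.linalg.det(Aj) / dA
        return x

    A = np.array([[2.0, 1.0, -1.0],
                  [1.0, 3.0,  2.0],
                  [1.0, 0.0,  1.0]])
    b = np.array([1.0, 5.0, 2.0])
    assert np.allclose(cramer_solve(A, b), np.linalg.solve(A, b))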

5.19.  Definition:  Minor

The square matrix $A_{kj}$ of order $n-1$ obtained by deleting the $k$th row and $j$th column of $A$ is called the $k,j$ minor of $A$.

5.19.  Theorem 5.19:  Expansion by kth Row Minors.

        $\operatorname{cof} a_{kj} = (-1)^{k+j} \det A_{kj}$                  (5.33)

The expansion of $\det A$ by the $k$th row minors is

        $\det A = \sum_{j=1}^{n} (-1)^{k+j} a_{kj} \det A_{kj}$                               (5.34)
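
Equation (5.34) translates directly into a recursive routine; the sketch below (illustrative helper det_by_minors, 0-based indices, expansion along the first row at every level) is exponential in $n$ and intended only to mirror the formula:

    import numpy as np

    def det_by_minors(A, k=0):
        # Expansion (5.34) along row k: sum of (-1)**(k+j) * a[k, j] * det(minor).
        n = A.shape[0]
        if n == 1:
            return A[0, 0]
        total = 0.0
        for j in range(n):
            minor = np.delete(np.delete(A, k, axis=0), j, axis=1)
            total += (-1) ** (k + j) * A[k, j] * det_by_minors(minor)
        return total

    A = np.array([[1.0, 2.0, 3.0],
                  [0.0, 4.0, 5.0],
                  [1.0, 0.0, 6.0]])
    assert np.isclose(det_by_minors(A), np.linalg.det(A))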

5.21.  Theorem 5.20:  Expansion by 1st Column Minors.

Assume determinants of order $n-1$ exist.  Let

        $f(A_1, \ldots, A_n) = \sum_{j=1}^{n} (-1)^{j+1} a_{j1} \det A_{j1}$          (5.37)

Then $f$ satisfies all three axioms for a determinant function of order $n$; hence determinants of arbitrary order $n$ exist.

5.21.  Theorem 5.21:  Transpose

$\det A^t = \det A$