
Invertibility

Invertibility is a property of square matrices. If a matrix is invertible, there is an inverse matrix that it can be multiplied by to produce the identity matrix. The calculation of an inverse matrix is called inversion.


Definition

A matrix A is invertible if there is a matrix A^-1 which satisfies AA^-1 = A^-1A = I.

If a matrix cannot be inverted, it is called singular (equivalently, degenerate or non-invertible).

Only square matrices can be invertible. However, a non-square matrix can separately have distinct left inverse and right inverse matrices. Generally, for m < n: a matrix with shape m by n and rank m (full row rank) can have a right inverse, while a matrix with shape n by m and rank m (full column rank) can have a left inverse.
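To make the right inverse concrete, here is a minimal pure-Python sketch (the helper names matmul, transpose, and inverse_2x2, and the particular A, are invented for this example): for a full-row-rank m by n matrix with m < n, one right inverse is R = A^T (A A^T)^-1, so that A R = I.

```python
from fractions import Fraction

def matmul(X, Y):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(col) for col in zip(*X)]

def inverse_2x2(M):
    """Invert a 2x2 matrix exactly via the determinant formula."""
    (a, b), (c, d) = M
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1, 0, 1],
     [0, 1, 1]]                 # shape 2x3, rank 2, so m < n
R = matmul(transpose(A), inverse_2x2(matmul(A, transpose(A))))
print(matmul(A, R))             # the 2x2 identity matrix
```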

Properties

By definition, AA^-1 = A^-1A = I.

An invertible matrix is square and has full rank.

An invertible matrix has only one vector in the null space: the zero vector. If any column of a matrix is a linear combination of the other columns, then the columns do not form a basis and the matrix is non-invertible.

For orthogonal matrices (such as permutation matrices), the inverse is also the transpose: Q^-1 = Q^T.
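As a quick check of the transpose-is-inverse property, here is a small pure-Python sketch (the matmul helper and the particular P are invented for this example):

```python
def matmul(X, Y):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

P = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]                        # a 3x3 permutation matrix
PT = [list(row) for row in zip(*P)]    # its transpose
print(matmul(P, PT))                   # the 3x3 identity matrix
```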


Test with Determinant

The determinant is the most common test for invertibility. If |A| != 0, then A is invertible. If |A| = 0, then A is non-invertible.
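For the 2 x 2 case, this test reduces to checking ad - bc. A minimal sketch (the det_2x2 name and the sample matrices are invented for this example):

```python
def det_2x2(M):
    """Determinant of a 2x2 matrix: ad - bc."""
    (a, b), (c, d) = M
    return a * d - b * c

print(det_2x2([[1, 3], [2, 7]]))   # 1, so this matrix is invertible
print(det_2x2([[1, 2], [2, 4]]))   # 0, so this matrix is singular
```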


Calculation with Determinant

Consider the 2 x 2 case:

┌     ┐
│ a b │
│ c d │
└     ┘

The inverse matrix is

  1    ┌      ┐
―――――― │ d -b │
Det(A) │ -c a │
       └      ┘


Calculation with Elimination

Because AA^-1 = I, applying elimination and backwards elimination on A augmented with an identity matrix (I) will create A^-1 in the augmentation.

The elimination proceeds as:

┌            ┐
│ [1] 3 │ 1 0│
│  2  7 │ 0 1│
└            ┘
┌               ┐
│ [1]  3  │  1 0│
│  0  [1] │ -2 1│
└               ┘

The reverse elimination proceeds as:

┌             ┐
│ 1  3  │  1 0│
│ 0 [1] │ -2 1│
└             ┘
┌                ┐
│ [1]  0  │  7 -3│
│  0  [1] │ -2  1│
└                ┘

A^-1 is:

┌      ┐
│  7 -3│
│ -2  1│
└      ┘
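The augmented-matrix procedure above can be sketched in pure Python. This is a simplified Gauss-Jordan that assumes nonzero pivots (no row exchanges); the invert name is invented for this example:

```python
from fractions import Fraction

def invert(A):
    """Run Gauss-Jordan on [A | I]; when the left half becomes I,
    the right half is A^-1. Assumes nonzero pivots throughout."""
    n = len(A)
    # Augment A with the identity matrix, using exact fractions.
    M = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(n)]
         for i, row in enumerate(A)]
    for i in range(n):
        pivot = M[i][i]
        M[i] = [x / pivot for x in M[i]]          # scale the pivot row to 1
        for r in range(n):
            if r != i and M[r][i]:
                factor = M[r][i]                  # eliminate above and below
                M[r] = [x - factor * y for x, y in zip(M[r], M[i])]
    return [row[n:] for row in M]

print(invert([[1, 3], [2, 7]]))   # entries 7, -3, -2, 1, matching the steps above
```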


Calculation with Determinants and Cofactor Matrices

Given the determinant of A, it can also be simple to compute A^-1 as (1/|A|)C^T. C is the cofactor matrix, where c_ij is the cofactor of a_ij.

For example, given a 2 x 2 A like:

┌    ┐
│ a b│
│ c d│
└    ┘

The cofactor matrix C is:

┌      ┐
│  d -c│
│ -b  a│
└      ┘

But this must be transposed to C^T:

┌      ┐
│  d -b│
│ -c  a│
└      ┘

And then A^-1 is:

┌                               ┐
│  (1/det A) * d  (1/det A) * -b│
│ (1/det A) * -c   (1/det A) * a│
└                               ┘

The elimination example above fits into this formula. The pivots produced by elimination show that the determinant of that A is 1; the more fundamental formula ad - bc expands to 1 * 7 - 2 * 3, which also gives a determinant of 1. As such, (1/|A|) is trivially 1, so simply plug the given (a, b, c, d) into the transposed cofactor matrix to find the inverse.
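The cofactor route can be sketched the same way for the 2 x 2 case (the inverse_via_cofactors name is invented for this example); it reproduces the result found by elimination:

```python
from fractions import Fraction

def inverse_via_cofactors(M):
    """Compute A^-1 = (1/|A|) * C^T for a 2x2 matrix."""
    (a, b), (c, d) = M
    det = Fraction(a * d - b * c)
    C = [[d, -c], [-b, a]]                                # cofactor matrix
    CT = [[C[j][i] for j in range(2)] for i in range(2)]  # its transpose
    return [[x / det for x in row] for row in CT]

A = [[1, 3], [2, 7]]
print(inverse_via_cofactors(A))   # entries 7, -3, -2, 1, as before
```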


CategoryRicottone

LinearAlgebra/Invertibility (last edited 2026-01-20 18:09:20 by DominicRicottone)