Invertibility
Invertibility is a property of square matrices. If a matrix is invertible, there is an inverse matrix that it can be multiplied by to produce the identity matrix. The process of calculating an inverse matrix is called inversion.
Definition
A matrix A is invertible if there is a matrix A^-1 which satisfies AA^-1 = A^-1A = I.
Only square matrices can be invertible. However, a non-square matrix can separately have distinct left inverse and right inverse matrices. Generally, if m < n, then a matrix with shape m by n and rank m can have a right inverse, and a matrix with shape n by m and rank m can have a left inverse.
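As an illustration of the full-row-rank case above, one right inverse of an m by n matrix A with rank m can be built as A^T(AA^T)^-1. This is a minimal sketch using NumPy; the example matrix is chosen here for illustration, not taken from the page.

```python
import numpy as np

# A 2x3 matrix with rank 2 (m = 2 < n = 3), so a right inverse exists.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 2.0]])

# One right inverse: A^T (A A^T)^-1, valid when A has full row rank.
A_right = A.T @ np.linalg.inv(A @ A.T)

# A @ A_right is the 2x2 identity, but A_right @ A is NOT the 3x3 identity:
# a right inverse only cancels A from one side.
print(np.round(A @ A_right, 10))
```

Note the asymmetry: only a square invertible matrix has a single two-sided inverse.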
Determinant
The determinant is the most common test for invertibility. If |A| != 0, then A is invertible. If |A| = 0, then A is non-invertible.
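The determinant test can be checked numerically. This sketch uses the 2 x 2 matrix inverted later on this page, plus a singular matrix made up for contrast:

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [2.0, 7.0]])   # |A| = 1*7 - 3*2 = 1, so A is invertible
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second row is 2x the first, so |B| = 0

print(np.linalg.det(A))      # nonzero: invertible
print(np.linalg.det(B))      # zero: non-invertible (singular)
```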
Properties
The core principle of inversion is that a matrix A can be canceled out of a larger system: xAA^-1 = x.
An invertible matrix has only one vector in its null space: the zero vector. If any column of a matrix is a linear combination of the others, then the columns do not form a basis and the matrix must be non-invertible.
For orthogonal matrices (such as permutation matrices), the inverse is also the transpose: Q^-1 = Q^T.
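The transpose-equals-inverse property can be verified with a small permutation matrix (a 3-cycle, chosen here as an example):

```python
import numpy as np

# A 3x3 permutation matrix that cycles rows 0 -> 1 -> 2 -> 0.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

# For an orthogonal matrix, the transpose is the inverse: P P^T = I.
print(np.allclose(P @ P.T, np.eye(3)))       # True
print(np.allclose(P.T, np.linalg.inv(P)))    # True
```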
Calculation
Because AA^-1 = I, applying elimination and backwards elimination on A augmented with an identity matrix (I) will create A^-1 in the augmentation.
Start with A augmented with I:
┌              ┐
│ [1] 3 │  1  0│
│  2  7 │  0  1│
└              ┘
Eliminate below the first pivot (row 2 minus 2 × row 1):
┌              ┐
│ [1] 3 │  1  0│
│  0  1 │ -2  1│
└              ┘
The second pivot is already 1:
┌              ┐
│  1  3  │  1  0│
│  0 [1] │ -2  1│
└              ┘
Eliminate above the second pivot (row 1 minus 3 × row 2):
┌              ┐
│ [1] 0  │  7 -3│
│  0 [1] │ -2  1│
└              ┘
A^-1 is:
┌       ┐
│  7 -3│
│ -2  1│
└       ┘
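The augmented-matrix procedure above can be sketched in a few lines of plain Python. This is a minimal version that assumes every pivot is nonzero (no row swaps), which holds for the example on this page:

```python
def gauss_jordan_inverse(a):
    """Invert a square matrix by row-reducing [A | I] to [I | A^-1].

    Minimal sketch: assumes every pivot is nonzero, so no row swaps.
    """
    n = len(a)
    # Augment A with the identity matrix.
    m = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(a)]
    for col in range(n):
        pivot = m[col][col]
        # Scale the pivot row so the pivot becomes 1.
        m[col] = [x / pivot for x in m[col]]
        # Eliminate this column from every other row
        # (below the pivot: elimination; above it: backwards elimination).
        for r in range(n):
            if r != col:
                factor = m[r][col]
                m[r] = [x - factor * p for x, p in zip(m[r], m[col])]
    # The right half of the augmented matrix is now A^-1.
    return [row[n:] for row in m]

print(gauss_jordan_inverse([[1.0, 3.0], [2.0, 7.0]]))
# [[7.0, -3.0], [-2.0, 1.0]]
```

Running it on the page's matrix reproduces the inverse computed by hand above.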
Determinant and Cofactor Matrix
Given the determinant of A, it can also be simple to compute A^-1 as (1/|A|)C^T. C is the cofactor matrix, where c_ij is the cofactor of a_ij.
For example, given a 2 x 2 A like:
┌     ┐
│ a b│
│ c d│
└     ┘
The cofactor matrix C is:
┌       ┐
│  d -c│
│ -b  a│
└       ┘
But this must be transposed to CT:
┌       ┐
│  d -b│
│ -c  a│
└       ┘
And then A^-1 is:
┌                                 ┐
│ (1/det A) *  d   (1/det A) * -b│
│ (1/det A) * -c   (1/det A) *  a│
└                                 ┘
The above example fits this formula. The elimination and backwards elimination produce pivots of 1 and 1, whose product is the determinant of that A: 1. The direct formula ad - bc expands to 1 * 7 - 3 * 2, which also gives a determinant of 1. As such, (1/|A|) is trivially 1, so simply plug the given (a, b, c, d) into the transposed cofactor matrix to find the inverse.
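The 2 x 2 cofactor formula above is short enough to write directly. This sketch applies it to the page's example matrix:

```python
def inverse_2x2(a, b, c, d):
    """Invert [[a, b], [c, d]] via the transposed cofactor matrix.

    A^-1 = (1 / (ad - bc)) * [[d, -b], [-c, a]]; requires ad - bc != 0.
    """
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

# The page's example: A = [[1, 3], [2, 7]] has determinant 1*7 - 3*2 = 1.
print(inverse_2x2(1.0, 3.0, 2.0, 7.0))
# [[7.0, -3.0], [-2.0, 1.0]]
```

This matches the inverse found by Gauss-Jordan elimination earlier on the page.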
