Matrix Inversion
For some matrices A, the inverse matrix (A^-1) is a matrix which can be multiplied by the original matrix to produce the identity matrix. The calculation of an inverse matrix, if it exists, is called inversion.
Definition
An inverse matrix satisfies the equation AA^-1 = I.
Not all matrices have a real inverse. Such matrices are non-invertible.
Calculation of the determinant is the common test for invertibility: if |A| != 0, then A is invertible; conversely, if |A| = 0, then A is non-invertible.
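The determinant test can be checked numerically. This is a minimal sketch using NumPy (the article itself prescribes no library; the matrices are the 2 x 2 example used later plus a hypothetical singular one):

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [2.0, 7.0]])
# det(A) = 1*7 - 3*2 = 1, which is nonzero, so A is invertible
assert abs(np.linalg.det(A) - 1.0) < 1e-9

B = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # second row is 2x the first
# det(B) = 0 (up to floating-point error), so B is non-invertible
assert abs(np.linalg.det(B)) < 1e-9
```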
Only square matrices can be invertible. However, a non-square matrix can separately have distinct left inverse and right inverse matrices. Generally, if a < b, then an a-by-b matrix with rank a can have a right inverse, and a b-by-a matrix with rank a can have a left inverse. By contrast, an invertible square matrix has a true inverse that works on either side: AA^-1 = A^-1A = I.
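The one-sided inverses above can be computed directly from the full-rank formulas R = A^T(AA^T)^-1 (right) and L = (B^T B)^-1 B^T (left). A sketch with NumPy, using a hypothetical rank-2 matrix chosen for illustration:

```python
import numpy as np

# A 2x3 matrix with rank 2 (a < b): admits a right inverse
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 2.0]])
R = A.T @ np.linalg.inv(A @ A.T)   # right inverse: A @ R = I
assert np.allclose(A @ R, np.eye(2))

# Its 3x2 transpose has rank 2: admits a left inverse
B = A.T
L = np.linalg.inv(B.T @ B) @ B.T   # left inverse: L @ B = I
assert np.allclose(L @ B, np.eye(2))
```

Note that R @ A and B @ L are not identity matrices; the inverses only work on one side.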
Properties
The core principle of inversion is that an invertible matrix A can be canceled out of a larger system: xAA^-1 = x.
An invertible matrix has only one vector in its null space: the zero vector. If any column of a matrix is a linear combination of the other columns, then the columns do not form a basis and the matrix must be non-invertible.
For orthogonal matrices (such as permutation matrices), the inverse is also the transpose: Q^-1 = Q^T.
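The transpose-as-inverse property is easy to verify for a small permutation matrix. A sketch with NumPy:

```python
import numpy as np

# Permutation matrix: swaps rows 0 and 1, leaves row 2 in place
P = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])

# For an orthogonal matrix, Q^T Q = I, so the transpose is the inverse
assert np.allclose(P.T @ P, np.eye(3))
assert np.allclose(P.T, np.linalg.inv(P))
```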
Calculation
Because AA^-1 = I, applying elimination and backwards elimination on A augmented with an identity matrix (I) will produce A^-1 in the augmentation.
Starting with A augmented with I:

┌                  ┐
│ [1]  3  │  1  0  │
│  2   7  │  0  1  │
└                  ┘

Eliminating below the first pivot (R2 ← R2 − 2·R1):

┌                  ┐
│ [1]  3  │  1  0  │
│  0  [1] │ -2  1  │
└                  ┘

Then eliminating above the second pivot (R1 ← R1 − 3·R2):

┌                  ┐
│ [1]  0  │  7 -3  │
│  0  [1] │ -2  1  │
└                  ┘
A^-1 is:
┌         ┐
│  7  -3  │
│ -2   1  │
└         ┘
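The augmented-elimination procedure above can be sketched in a few lines of Python with NumPy. This is an illustrative implementation (with partial pivoting added for numerical safety), not a production routine:

```python
import numpy as np

def invert_gauss_jordan(A):
    """Invert A by row-reducing the augmented matrix [A | I] to [I | A^-1]."""
    n = len(A)
    aug = np.hstack([np.asarray(A, dtype=float), np.eye(n)])
    for col in range(n):
        # Partial pivot: move the row with the largest entry in this column up
        pivot = col + np.argmax(np.abs(aug[col:, col]))
        aug[[col, pivot]] = aug[[pivot, col]]
        aug[col] /= aug[col, col]                     # scale pivot row so pivot = 1
        for row in range(n):
            if row != col:
                aug[row] -= aug[row, col] * aug[col]  # clear rest of the column
    return aug[:, n:]

A = [[1, 3],
     [2, 7]]
# Matches the worked example: A^-1 = [[7, -3], [-2, 1]]
assert np.allclose(invert_gauss_jordan(A), [[7, -3], [-2, 1]])
```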
Determinant and Cofactor Matrix
Given the determinant of A, A^-1 can also be computed as (1/|A|)C^T. C is the cofactor matrix, where c_ij is the cofactor of a_ij.
For example, given a 2 x 2 A like:
┌       ┐
│ a  b  │
│ c  d  │
└       ┘
The cofactor matrix C is:
┌        ┐
│  d  -c │
│ -b   a │
└        ┘
But this must be transposed to C^T:
┌        ┐
│  d  -b │
│ -c   a │
└        ┘
And then A^-1 is:
┌                                    ┐
│ (1/det A) * d     (1/det A) * -b   │
│ (1/det A) * -c    (1/det A) * a    │
└                                    ┘
The above example fits this formula. The elimination and backwards elimination showed that the determinant of that A is 1. The more fundamental formula ad - bc expands to 1 * 7 - 3 * 2, which also gives a determinant of 1. As such, (1/|A|) is trivially 1, so simply plug the given (a, b, c, d) into the transposed cofactor matrix to find the inverse.
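The 2 x 2 cofactor formula translates directly to code. A minimal sketch (the function name is illustrative, not from the article):

```python
import numpy as np

def inverse_2x2(A):
    """Inverse of a 2x2 matrix via (1/det A) * C^T."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is non-invertible")
    # Transposed cofactor (adjugate) matrix
    adj = np.array([[ d, -b],
                    [-c,  a]], dtype=float)
    return adj / det

A = [[1, 3],
     [2, 7]]
# det A = 1, so the inverse is just the adjugate: [[7, -3], [-2, 1]]
assert np.allclose(inverse_2x2(A), [[7, -3], [-2, 1]])
```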