Orthogonality
Orthogonality is a generalization of perpendicularity. Orthonormality is a related concept, requiring that the components be unit vectors.
See also vector orthogonality.
Orthogonality
The notation for orthogonality is ⊥, as in x ⊥ y. Two vectors are orthogonal exactly when their dot product is zero: x ⊥ y if and only if x^T y = 0.
For a subspace S to be orthogonal to a subspace T, every vector in S must be orthogonal to every vector in T. Null spaces are a classic example. For a given matrix A, its null space N(A) contains exactly the vectors orthogonal to every row of A, so N(A) is orthogonal to the row space R(A). Similarly, N(A^T) is orthogonal to the column space C(A).
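This can be checked numerically. The sketch below uses a hypothetical rank-1 matrix whose null space is easy to write down, and verifies that each null-space vector is orthogonal to every row:

```python
import numpy as np

# Hypothetical example: a rank-1 matrix, so its null space is 2-dimensional.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# Two vectors in N(A): A @ v = 0 for each.
null_vecs = [np.array([2.0, -1.0, 0.0]), np.array([3.0, 0.0, -1.0])]

for v in null_vecs:
    assert np.allclose(A @ v, 0)          # v really is in the null space
    for row in A:
        assert np.isclose(row @ v, 0.0)   # v is orthogonal to every row of A
```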
Orthonormality
A matrix whose columns are orthonormal is called a matrix with orthonormal columns. These are usually denoted as Q. These have two important properties:
Q^T Q = I
The projection matrix is given as P = QQ^T.
The second follows from the first. Recall that, when projecting b into C(A), the projection matrix is given as P = A(A^T A)^-1 A^T. This comes from the normal equations A^T A x̂ = A^T b together with the requirement that p = Pb. For a matrix Q with orthonormal columns, Q^T Q = I, so the normal equations collapse to x̂ = Q^T b. Therefore, P = QQ^T.
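The simplification can be verified numerically. The sketch below builds a Q with orthonormal columns from an arbitrary (assumed) tall matrix via QR factorization, then compares the general projection formula with the simplified P = QQ^T:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2))   # arbitrary tall matrix (an assumed example)
Q, _ = np.linalg.qr(A)            # Q: 4x2 with orthonormal columns

# First property: Q^T Q = I
assert np.allclose(Q.T @ Q, np.eye(2))

# Projection matrix two ways: general formula vs. the simplified P = QQ^T
P_general = Q @ np.linalg.inv(Q.T @ Q) @ Q.T
P_simple = Q @ Q.T
assert np.allclose(P_general, P_simple)

# And x_hat = Q^T b solves the normal equations Q^T Q x_hat = Q^T b
b = rng.standard_normal(4)
x_hat = Q.T @ b
assert np.allclose(Q.T @ Q @ x_hat, Q.T @ b)
```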
If such a matrix with orthonormal columns is also square, then it can be called an orthogonal matrix. These have several important properties:
Q^T Q = QQ^T = I
Q^T = Q^-1
The determinant is always 1 or -1
The projection matrix is given as P = I: a square Q has n independent columns, so C(Q) is the entire space and every b is already in it.
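These properties can be checked on a concrete square orthogonal matrix. A 2×2 rotation matrix (an assumed example) is one of the simplest:

```python
import numpy as np

# A rotation matrix is a simple square orthogonal matrix (assumed example).
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(Q.T @ Q, np.eye(2))       # Q^T Q = I
assert np.allclose(Q @ Q.T, np.eye(2))       # QQ^T = I
assert np.allclose(Q.T, np.linalg.inv(Q))    # Q^T = Q^-1
assert np.isclose(abs(np.linalg.det(Q)), 1)  # determinant is 1 or -1
assert np.allclose(Q @ Q.T, np.eye(2))       # projection P = QQ^T = I
```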
