Orthogonality
Orthogonality is a generalization of perpendicularity. Orthonormality is a related concept, requiring in addition that the vectors involved be unit vectors.
See also vector orthogonality.
Orthogonality
The notation for orthogonality is ⊥, as in x ⊥ y.
For a subspace S to be orthogonal to a subspace T, every vector in S must be orthogonal to every vector in T. Null spaces provide a standard example. For a given matrix A, its null space N(A) consists of the vectors x with Ax = 0; each such x has a zero dot product with every row of A, so N(A) is orthogonal to the row space R(A). Similarly, N(A^T) is orthogonal to the column space C(A).
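As a quick numeric check, the following sketch uses NumPy with an arbitrary rank-1 matrix (the matrix A and the null-space vectors here are illustrative choices, not taken from the text):

```python
import numpy as np

# A has rank 1: its row space R(A) is spanned by (1, 2, 3).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# Two independent vectors in N(A): each satisfies A @ n = 0.
n1 = np.array([-2.0, 1.0, 0.0])   # 1*(-2) + 2*1 + 3*0 = 0
n2 = np.array([-3.0, 0.0, 1.0])   # 1*(-3) + 2*0 + 3*1 = 0

print(A @ n1)  # [0. 0.]
print(A @ n2)  # [0. 0.]

# Every row of A is orthogonal to every null-space vector,
# so R(A) is orthogonal to N(A).
print(A[0] @ n1, A[0] @ n2)  # 0.0 0.0
```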
Orthonormality
A matrix whose columns are orthonormal is called a matrix with orthonormal columns. These are usually denoted as Q. They have two important properties:
Q^T Q = I
The projection matrix onto C(Q) is given as P = QQ^T.
The second follows from the first. Recall that, when projecting b onto C(A), the projection matrix is given as P = A(A^T A)^-1 A^T. This comes from the normal equations A^T A x̂ = A^T b together with p = Pb = Ax̂. For a matrix Q with orthonormal columns, the first property reduces the normal equations to x̂ = Q^T b, so p = Qx̂ = QQ^T b and therefore P = QQ^T.
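The simplification can be verified numerically. The sketch below (an assumption-laden example: a random tall matrix, with Q obtained from a reduced QR factorization so that C(Q) = C(A)) compares the general projection formula with QQ^T:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2))   # arbitrary full-column-rank matrix

# Reduced QR: Q is 4x2 with orthonormal columns spanning C(A).
Q, R = np.linalg.qr(A)

# Property 1: Q^T Q = I.
print(np.allclose(Q.T @ Q, np.eye(2)))   # True

# Property 2: the projection onto C(A) simplifies to P = Q Q^T.
P_general = A @ np.linalg.inv(A.T @ A) @ A.T
P_ortho = Q @ Q.T
print(np.allclose(P_general, P_ortho))   # True
```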
If such a matrix with orthonormal columns is also square, then it can be called an orthogonal matrix. These have several important properties:
Q^T Q = QQ^T = I
Q^T = Q^-1
The determinant is always 1 or -1.
The projection matrix is given as P = I: the columns of a square Q span the whole space, so every b already lies in C(Q) and projection leaves it unchanged.
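These properties can be checked on a concrete square example. A 2x2 rotation matrix is a standard choice of orthogonal matrix (the angle here is arbitrary):

```python
import numpy as np

theta = 0.3  # arbitrary rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))     # True: Q^T Q = I
print(np.allclose(Q.T, np.linalg.inv(Q)))  # True: Q^T = Q^-1
print(np.linalg.det(Q))                    # 1 for a rotation (always +/-1)

# P = Q Q^T = I: a square Q's columns span all of R^2,
# so projecting onto C(Q) changes nothing.
print(np.allclose(Q @ Q.T, np.eye(2)))     # True
```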
