Orthogonality
Orthogonality is a generalization of perpendicularity. Orthonormality is a related concept, additionally requiring that the vectors be unit vectors.
See also vector orthogonality.
Description
Orthogonality is an extension of perpendicularity to higher dimensions of Euclidean space, and also to arbitrary inner product spaces. To notate that x is orthogonal to y, use x ⊥ y.
Orthonormality is a further constraint: the orthogonal vectors are also unit vectors.
For two vectors (neither of which is the zero vector) to be orthogonal, their inner product must equal 0. For a set of vectors to be orthogonal, every possible pair from the set must be orthogonal.
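This pairwise condition is easy to check numerically. The following is a minimal sketch using NumPy; the vectors and the helper `is_orthogonal_set` are illustrative choices, not part of the article:

```python
import numpy as np

# Two nonzero vectors are orthogonal when their inner product is 0.
x = np.array([1.0, 2.0, -1.0])
y = np.array([3.0, -1.0, 1.0])  # chosen so 1*3 + 2*(-1) + (-1)*1 = 0

print(np.dot(x, y))  # 0.0, so x ⊥ y

# A set of vectors is orthogonal when every distinct pair is orthogonal.
def is_orthogonal_set(vectors, tol=1e-12):
    """Check every distinct pair for a (numerically) zero inner product."""
    return all(abs(np.dot(u, v)) < tol
               for i, u in enumerate(vectors)
               for v in vectors[i + 1:])

print(is_orthogonal_set([x, y]))  # True
```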
Matrices
A matrix is effectively a set of vectors. If the columns are orthonormal, then it can be called a matrix with orthonormal columns. These are usually denoted as Q.
Matrices with orthonormal columns have several important properties:
QᵀQ = I
The projection matrix is given as P = QQᵀ.
The second follows from the first. Recall that, when projecting b onto C(A), the projection matrix is given as P = A(AᵀA)⁻¹Aᵀ. This comes from the linear system AᵀAx̂ = Aᵀb and requiring that p = Pb. For a matrix Q with orthonormal columns, the first property simplifies the linear system to x̂ = Qᵀb. Therefore, P = QQᵀ.
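The simplification can be verified numerically. Below is a sketch using NumPy with an illustrative 3×2 matrix Q (not from the article); note that projecting with x̂ = Qᵀb needs no matrix inversion:

```python
import numpy as np

# A 3x2 matrix with orthonormal columns (an illustrative choice).
s = 1 / np.sqrt(2)
Q = np.array([[s,   0.0],
              [s,   0.0],
              [0.0, 1.0]])

# Property 1: QᵀQ = I (the 2x2 identity).
print(np.allclose(Q.T @ Q, np.eye(2)))  # True

# Property 2: the projection onto C(Q) is P = QQᵀ, because the
# normal equations QᵀQ x̂ = Qᵀb reduce to x̂ = Qᵀb.
b = np.array([1.0, 0.0, 2.0])
x_hat = Q.T @ b      # no inversion of QᵀQ needed
p = Q @ x_hat        # the projection of b onto C(Q)
P = Q @ Q.T
print(np.allclose(p, P @ b))  # True
```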
Only if such a matrix with orthonormal columns is also square can it be called an orthogonal matrix. Orthogonal matrices have several further properties:
QᵀQ = QQᵀ = I
Qᵀ = Q⁻¹
The determinant is always 1 or -1
The projection matrix is given as P = QQᵀ = I, indicating that any b is already in C(Q).
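A rotation matrix is a standard example of an orthogonal matrix, and all four properties can be checked directly. A minimal NumPy sketch (the angle is an arbitrary illustrative choice):

```python
import numpy as np

# A 2x2 rotation matrix is square with orthonormal columns,
# hence an orthogonal matrix.
theta = np.pi / 3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))      # True: QᵀQ = I
print(np.allclose(Q @ Q.T, np.eye(2)))      # True: QQᵀ = I, so P = QQᵀ = I
print(np.allclose(Q.T, np.linalg.inv(Q)))   # True: Qᵀ = Q⁻¹
print(np.isclose(abs(np.linalg.det(Q)), 1)) # True: determinant is 1 or -1
```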
Subspaces
For two subspaces to be orthogonal, every vector in the span of one should be orthogonal to every vector in the span of the other.
For example, consider a plane in R3. A plane is a subspace spanned by 2 vectors. The subspace that is orthogonal to a plane must be spanned by 1 vector, i.e. it is a line. A plane and a line can be checked for orthogonality by comparing each of the vectors spanning the plane for orthogonality with the single vector spanning the line.
For another example, null spaces are orthogonal to row spaces by definition. For any matrix A, a vector x in N(A) satisfies Ax = 0, which says exactly that x is orthogonal to every row of A; therefore N(A) is orthogonal to R(A). Similarly, every vector in N(Aᵀ) is orthogonal to every column of A, so N(Aᵀ) is orthogonal to C(A).
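The Ax = 0 argument can be seen concretely. In the sketch below the rank-1 matrix A and the two vectors spanning N(A) are illustrative choices, found by solving x + 2y + 3z = 0 by hand:

```python
import numpy as np

# A rank-1 matrix: its null space is the plane x + 2y + 3z = 0 in R^3.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# Two vectors spanning N(A).
n1 = np.array([2.0, -1.0, 0.0])
n2 = np.array([3.0, 0.0, -1.0])

print(A @ n1)  # [0. 0.], so n1 is in N(A)
print(A @ n2)  # [0. 0.]

# Ax = 0 means x is orthogonal to every row of A, i.e. N(A) ⊥ R(A).
for row in A:
    print(np.dot(row, n1), np.dot(row, n2))  # all 0.0
```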
