Differences between revisions 1 and 15 (spanning 14 versions)
Revision 1 as of 2024-01-21 02:20:44
Size: 2078
Comment: Initial commit
Revision 15 as of 2025-09-24 20:19:45
Size: 2145
Comment: Link
Deletions are marked like this. Additions are marked like this.
Line 2: Line 2:

'''Orthogonality''' is a generalization of perpendicularity. '''Orthonormality''' is a related concept, requiring that the components be [[Calculus/UnitVector|unit vectors]].

See also [[Calculus/Orthogonality|vector orthogonality]].

Line 9: Line 15:
== Test == == Orthogonality ==
Line 11: Line 17:
The test for orthogonality of two vectors is '''''X'''^T^'''Y''' = 0''. The notation for orthogonality is ''⊥'', as in ''x ⊥ y''.
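The test above can be sketched in Python (a minimal illustration, not part of the original page; the `dot` helper is a hypothetical name):

```python
# Two vectors are orthogonal when X^T Y = 0, i.e. their dot product is zero.
def dot(x, y):
    """Dot product X^T Y of two equal-length vectors."""
    return sum(a * b for a, b in zip(x, y))

x = [1, 2, 3]
y = [2, -1, 0]
print(dot(x, y))  # 0, so x and y are orthogonal
```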
Line 13: Line 19:
The Pythagorean theorem specifies that for two sides of a right triangle, ''x'' and ''y'', the hypotenuse ''z'' is characterized by ''x^2^ + y^2^ = z^2^''. For a subspace S to be orthogonal to a subspace T, every vector in S must be orthogonal to every vector in T. [[LinearAlgebra/NullSpace|Null spaces]] are a trivial example. For a given matrix '''''A''''', its null space (i.e., ''N('''A''')'') contains exactly the vectors that are orthogonal to every row, so it is orthogonal to the row space (i.e., ''R('''A''')''). Similarly, ''N('''A'''^T^)'' is orthogonal to the column space of '''''A''''' (i.e., ''C('''A''')'').
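A concrete sketch of the null space example (illustrative values, not from the original page):

```python
# A has rank 1: its row space is spanned by [1, 2, 3].
A = [[1, 2, 3],
     [2, 4, 6]]
# Two vectors from N(A): A applied to either one gives the zero vector.
n1 = [2, -1, 0]
n2 = [3, 0, -1]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

# Every row of A is orthogonal to every null space vector,
# hence N(A) is orthogonal to R(A).
print([dot(row, n) for row in A for n in (n1, n2)])  # [0, 0, 0, 0]
```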
Line 15: Line 21:
For a vector '''''X''''', the total length can be thought of as the sum of each component's absolute value. If '''''X''''' is ''[1 2 3]'', the length is 6. The squared length can be thought of as the sum of each component's square. For the same '''''X''''', this is 14. This can be generalized as '''''X'''^T^'''X'''''.
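Both quantities are one-liners in Python (a sketch of the example above, not part of the original page):

```python
# For X = [1 2 3]: length as the sum of absolute values,
# squared length as X^T X, the sum of squares.
x = [1, 2, 3]
length = sum(abs(c) for c in x)    # 1 + 2 + 3 = 6
squared = sum(c * c for c in x)    # 1 + 4 + 9 = 14, i.e. X^T X
print(length, squared)  # 6 14
```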

For a similar reason, the hypotenuse can be thought of as the sum of the other two vectors. Instead of characterizing a vector '''''Z''''' directly, we can use '''''(X+Y)^T^(X+Y)'''''. For example, if '''''X''''' is ''[1 2 3]'' and '''''Y''''' is ''[2 -1 0]'', it should be understood that '''''Z''''' is ''[3 1 3]''.

If the vectors '''''X''''' and '''''Y''''' are perpendicular, or '''orthogonal''', then the Pythagorean theorem should hold. '''''X'''^T^'''X''' + '''Y'''^T^'''Y''' = ('''X'''+'''Y''')^T^('''X'''+'''Y''')''. This expands to '''''X'''^T^'''X''' + '''Y'''^T^'''Y''' = '''X'''^T^'''X''' + '''Y'''^T^'''Y''' + '''X'''^T^'''Y''' + '''Y'''^T^'''X'''''. By cancelling out common terms, this simplifies to ''0 = '''X'''^T^'''Y''' + '''Y'''^T^'''X'''''.

It must be understood that the last two terms are the same value: each is a scalar, and each is the transpose of the other, so '''''X'''^T^'''Y''' = '''Y'''^T^'''X'''''. Therefore, this further simplifies to ''0 = 2'''X'''^T^'''Y''''' and finally to ''0 = '''X'''^T^'''Y'''''.
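The derivation can be verified with the example vectors from above (a Python sketch, not part of the original page; `sq_len` is a hypothetical helper name):

```python
# X and Y are orthogonal (X^T Y = 0), so the Pythagorean theorem holds:
# X^T X + Y^T Y == (X+Y)^T (X+Y)
x = [1, 2, 3]
y = [2, -1, 0]
z = [a + b for a, b in zip(x, y)]        # Z = X + Y = [3, 1, 3]

def sq_len(v):
    """Squared length v^T v."""
    return sum(c * c for c in v)

print(sq_len(x) + sq_len(y), sq_len(z))  # 19 19
```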
Line 27: Line 26:
== Application == == Orthonormality ==
Line 29: Line 28:
For a subspace S to be orthogonal to a subspace T, every vector in S must be orthogonal to every vector in T. If a matrix is composed of [[LinearAlgebra/Orthonormalization|orthonormal columns]], then it can be called a '''matrix with orthonormal columns'''. These are usually denoted as '''''Q'''''. These have several important properties:
 * '''''Q'''^T^'''Q''' = '''I'''''
 * The [[LinearAlgebra/Projection|projection matrix]] is given as '''''P''' = '''QQ'''^T^''.
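These two properties can be checked numerically; the following Python sketch (illustrative values, not part of the original page; `matmul` and `transpose` are hypothetical helper names) verifies them for one 3x2 matrix with orthonormal columns:

```python
# A 3x2 matrix Q with orthonormal columns.
r = 2 ** -0.5
Q = [[r, 0.0],
     [r, 0.0],
     [0.0, 1.0]]

def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

QtQ = matmul(transpose(Q), Q)  # Q^T Q: the 2x2 identity (up to rounding)
P = matmul(Q, transpose(Q))    # projection matrix P = Q Q^T
print([[round(v, 10) for v in row] for row in QtQ])  # [[1.0, 0.0], [0.0, 1.0]]
```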
Line 31: Line 32:
The [[LinearAlgebra/NullSpaces|null space]] of '''''A''''' (a.k.a. ''N('''A''')'') is orthogonal to the row space of '''''A''''' (a.k.a. ''R('''A''')''). The [[LinearAlgebra/NullSpaces|null space]] of '''''A'''^T^'' (a.k.a. ''N('''A'''^T^)'') is orthogonal to the column space of '''''A''''' (a.k.a. ''C('''A''')''). The second follows from the first. Recall that, when projecting ''b'' onto ''C('''A''')'', the projection matrix is given as '''''P''' = '''A'''('''A'''^T^'''A''')^-1^'''A'''^T^''. This comes from the linear system '''''A'''^T^'''A'''x̂ = '''A'''^T^b'' and requiring that ''p = '''P'''b''. For a matrix '''''Q''''' with orthonormal columns, the first property simplifies the linear system to ''x̂ = '''Q'''^T^b''. Therefore, '''''P''' = '''QQ'''^T^''.

If such a matrix with orthonormal columns is ''also'' square, then it can be called an '''orthogonal matrix'''. These have several important properties:

 * '''''Q'''^T^'''Q''' = '''QQ'''^T^ = '''I'''''
 * '''''Q'''^T^ = '''Q'''^-1^''
 * The [[LinearAlgebra/Determinant|determinant]] is always 1 or -1
 * The projection matrix is given as '''''P''' = '''I''''', indicating that ''b'' must be in ''C('''A''')''.
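A 2x2 rotation matrix is the standard example of an orthogonal matrix; the following Python sketch (illustrative, not part of the original page) checks the first and third properties for one rotation angle:

```python
# A rotation matrix is orthogonal: Q^T Q = Q Q^T = I and det(Q) = 1 or -1.
import math

t = math.pi / 3
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

Qt = [list(row) for row in zip(*Q)]                 # Q^T
QtQ = [[sum(Qt[i][k] * Q[k][j] for k in range(2))
        for j in range(2)] for i in range(2)]
det = Q[0][0] * Q[1][1] - Q[0][1] * Q[1][0]

print([[round(v, 10) for v in row] for row in QtQ])  # [[1.0, 0.0], [0.0, 1.0]]
print(round(det, 10))                                # 1.0
```

A reflection (e.g. negating one column) would give determinant -1 instead.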

LinearAlgebra/Orthogonality (last edited 2025-09-24 20:19:45 by DominicRicottone)