
= Orthogonality =

'''Orthogonality''' is a property that relates two vectors, two subspaces, or a vector and a plane. The mathematical notation is ''⊥'', as in ''x ⊥ y''.

'''Orthonormality''' is a stricter concept: the vectors must be orthogonal ''and'' must be unit vectors.


== Vectors ==

The concept of '''orthogonality''' is a generalization of '''perpendicularity''' from 2-dimensional space.

Put simply, vectors ''a'' and ''b'' are orthogonal if and only if their [[LinearAlgebra/VectorMultiplication#Dot_Product|dot product]] is 0.

More precisely: if ''a'' and ''b'' are orthogonal, they satisfy the Pythagorean theorem. The hypotenuse is ''a+b'', and its squared length is the squared [[LinearAlgebra/Distance|Euclidean distance]] ''(a+b)^T^(a+b)''. Expanding and simplifying:

''a^T^a + b^T^b = (a+b)^T^(a+b)''

''a^T^a + b^T^b = a^T^a + b^T^b + a^T^b + b^T^a''

The common terms cancel:

''0 = a^T^b + b^T^a''

Since ''a^T^b'' and ''b^T^a'' are the same scalar, this simplifies to:

''0 = 2(a^T^b)''

''0 = a^T^b''
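
As a quick numerical check, the following sketch (using NumPy; the two vectors are arbitrary examples) confirms that a zero dot product goes hand in hand with the Pythagorean identity:

{{{#!python
import numpy as np

# Hypothetical example vectors with a zero dot product.
a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, -1.0, 0.0])

print(a @ b)              # 0.0  -- the orthogonality test a^T b = 0
print(a @ a + b @ b)      # 19.0
print((a + b) @ (a + b))  # 19.0 -- the Pythagorean identity holds
}}}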


== Subspaces ==

For a subspace ''S'' to be orthogonal to a subspace ''T'', every vector in ''S'' must be orthogonal to every vector in ''T''.
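
Because dot products are linear, it is enough to check basis vectors: if every basis vector of ''S'' is orthogonal to every basis vector of ''T'', then the subspaces are orthogonal. A minimal sketch, with hypothetical bases:

{{{#!python
import numpy as np

# Hypothetical bases: columns of S_basis span S, columns of T_basis span T.
S_basis = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [0.0, 0.0]])   # the xy-plane in R^3
T_basis = np.array([[0.0],
                    [0.0],
                    [1.0]])        # the z-axis in R^3

# Every pairwise dot product between the basis vectors is zero,
# so every vector of S is orthogonal to every vector of T.
print(S_basis.T @ T_basis)         # [[0.] [0.]]
}}}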


== Vectors and Planes ==

The [[LinearAlgebra/NullSpaces|null space]] of a matrix '''''A''''' contains the vectors that '''''A''''' sends to the zero vector. No nonzero vector in the null space is a linear combination of the rows; if the row space is a plane, these vectors do not lie on that plane.

The null space of '''''A''''' (a.k.a. ''N('''A''')'') is '''orthogonal''' to the row space of '''''A''''' (a.k.a. ''R('''A''')''). The null space of '''''A'''^T^'' (a.k.a. ''N('''A'''^T^)'') is orthogonal to the column space of '''''A''''' (a.k.a. ''C('''A''')''). In the typical mental picture the row and column spaces are planes while ''N('''A''')'' and ''N('''A'''^T^)'' are lines, but the actual dimensions depend on the rank of '''''A'''''.
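
A numerical illustration (a sketch with a hypothetical rank-2 matrix; a null space basis is taken from the SVD):

{{{#!python
import numpy as np

# Hypothetical rank-2 matrix: its row space is a plane in R^3.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 5.0]])

# Right singular vectors belonging to zero singular values span N(A).
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:].T            # columns span N(A)

# Each entry of A @ null_basis is (row of A) . (null space vector),
# so an all-zero result shows N(A) is orthogonal to the row space R(A).
print(np.allclose(A @ null_basis, 0.0))   # True
}}}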


== Matrices ==

A matrix whose columns are orthonormal is called a '''matrix with orthonormal columns''', usually denoted '''''Q'''''. Such a matrix has an important property: '''''Q'''^T^'''Q''' = '''I'''''.
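
For instance, a minimal sketch (QR decomposition of an arbitrary matrix is just one way to obtain orthonormal columns):

{{{#!python
import numpy as np

# Hypothetical starting matrix; QR gives orthonormal columns spanning C(A).
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, _ = np.linalg.qr(A)

print(np.allclose(Q.T @ Q, np.eye(2)))   # True: Q^T Q = I
}}}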

If '''''A''''' is a matrix with orthonormal columns, its [[LinearAlgebra/Projections#Matrices|projection]] matrix simplifies from '''''P''' = '''A'''('''A'''^T^'''A''')^-1^'''A'''^T^'' to '''''P''' = '''QQ'''^T^''. Correspondingly, the system of normal equations simplifies from '''''A'''^T^'''A'''x̂ = '''A'''^T^b'' to ''x̂ = '''Q'''^T^b''.
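
A sketch of both simplifications, reusing a hypothetical '''''Q''''' with orthonormal columns and an arbitrary ''b'':

{{{#!python
import numpy as np

# Rebuild the hypothetical Q so this snippet stands alone.
Q, _ = np.linalg.qr(np.array([[1.0, 1.0],
                              [1.0, 0.0],
                              [0.0, 1.0]]))
b = np.array([1.0, 2.0, 3.0])

# General projection formula vs. the orthonormal-columns shortcut.
P_general = Q @ np.linalg.inv(Q.T @ Q) @ Q.T
P_shortcut = Q @ Q.T
print(np.allclose(P_general, P_shortcut))      # True: P = QQ^T

# Normal equations A^T A x_hat = A^T b reduce to x_hat = Q^T b.
x_hat = Q.T @ b
print(np.allclose(Q @ x_hat, P_shortcut @ b))  # True: same projection of b
}}}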

If a matrix with orthonormal columns is ''also'' square, only then can it be called an '''orthogonal matrix'''. This has an additional important property: '''''Q'''^T^ = '''Q'''^-1^''.

Consequently, if '''''Q''''' is square, the projection matrix further simplifies to '''''P''' = '''QQ'''^T^ = '''I'''''.
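
For a concrete check, a 2x2 rotation matrix (a standard example, not taken from the text above) is square with orthonormal columns:

{{{#!python
import numpy as np

# A rotation matrix is a hypothetical but standard example of an orthogonal matrix.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T, np.linalg.inv(Q)))   # True: Q^T = Q^-1
print(np.allclose(Q @ Q.T, np.eye(2)))      # True: P = QQ^T = I
}}}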


CategoryRicottone
