= Orthogonality =

'''Orthogonality''' is a generalization of perpendicularity. '''Orthonormality''' is a related concept, additionally requiring that the components be [[Calculus/UnitVector|unit vectors]]. See also [[Calculus/Orthogonality|vector orthogonality]].

----

== Description ==

'''Orthogonality''' is an extension of perpendicularity to higher dimensions of Euclidean space, and also to arbitrary [[LinearAlgebra/InnerProduct|inner product spaces]]. To notate that ''x'' is orthogonal to ''y'', write ''x ⊥ y''. '''Orthonormality''' is a further constraint: the orthogonal vectors must also be [[Calculus/UnitVector|unit vectors]].

For two vectors (neither being the zero vector) to be orthogonal, their inner product must equal 0. For a set of vectors to be orthogonal, every possible pair from the set must be orthogonal.

=== Matrices ===

A matrix is effectively a set of column vectors. If the columns are orthonormal, it can be called a '''matrix with orthonormal columns'''. These are usually denoted as '''''Q'''''.

Matrices with orthonormal columns have several important properties:

* '''''Q'''^T^'''Q''' = '''I'''''
* The [[LinearAlgebra/Projection|projection matrix]] is given as '''''P''' = '''QQ'''^T^''

The second follows from the first. Recall that, when projecting ''b'' into ''C('''A''')'', the projection matrix is given as '''''P''' = '''A'''('''A'''^T^'''A''')^-1^'''A'''^T^''. This comes from the linear system '''''A'''^T^'''A'''x̂ = '''A'''^T^b'' together with the requirement that ''p = '''A'''x̂ = '''P'''b''. For a matrix '''''Q''''' with orthonormal columns, the first property simplifies the linear system to ''x̂ = '''Q'''^T^b''. Therefore, '''''P''' = '''QQ'''^T^''.

If and ''only'' if such a matrix is square, it can be called an '''orthogonal matrix'''.
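The simplification above can be sketched numerically with NumPy. Here the matrix '''''A''''' is an arbitrary example; its QR factorization supplies a '''''Q''''' with orthonormal columns spanning ''C('''A''')'':

```python
import numpy as np

# An arbitrary 3x2 example matrix with independent columns
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

# QR factorization yields Q, a matrix with orthonormal columns spanning C(A)
Q, R = np.linalg.qr(A)

# First property: Q^T Q = I
print(np.allclose(Q.T @ Q, np.eye(2)))           # True

# Projecting b onto C(A): the general formula A (A^T A)^-1 A^T
# agrees with the simplified P = Q Q^T
b = np.array([1.0, 2.0, 3.0])
P_general = A @ np.linalg.inv(A.T @ A) @ A.T
P_simple = Q @ Q.T
print(np.allclose(P_general @ b, P_simple @ b))  # True
```

Note that '''''Q''''' here is 3×2, not square, so it is a matrix with orthonormal columns but not an orthogonal matrix.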
These have several further properties:

* '''''Q'''^T^'''Q''' = '''QQ'''^T^ = '''I'''''
* '''''Q'''^T^ = '''Q'''^-1^''
* The [[LinearAlgebra/Determinant|determinant]] is always 1 or -1
* The projection matrix is given as '''''P''' = '''I''''', indicating that every ''b'' is already in ''C('''Q''')''

=== Subspaces ===

For two subspaces to be orthogonal, every vector in one must be orthogonal to every vector in the other.

For example, consider a plane in ''R^3^''. A plane is a subspace spanned by 2 vectors. The largest subspace orthogonal to a plane is spanned by 1 vector, i.e. it is a line. A plane and a line can be checked for orthogonality by comparing each of the vectors spanning the plane for orthogonality with the single vector spanning the line.

For another example, [[LinearAlgebra/NullSpace|null spaces]] are orthogonal to row spaces by definition. For any matrix '''''A''''', the vectors in ''N('''A''')'' are precisely those orthogonal to every row of '''''A''''', therefore ''N('''A''') ⊥ R('''A''')''. Similarly, the vectors in ''N('''A'''^T^)'' are precisely those orthogonal to every column of '''''A''''', therefore ''N('''A'''^T^) ⊥ C('''A''')''.

----

CategoryRicottone
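The null space orthogonality described in the Subspaces section can be sketched numerically. The matrix '''''A''''' and its null space vector below are arbitrary examples:

```python
import numpy as np

# An arbitrary 2x3 example matrix
A = np.array([[1.0, 0.0, -1.0],
              [0.0, 1.0, -1.0]])

# A vector spanning N(A): by definition, A @ n = 0
n = np.array([1.0, 1.0, 1.0])
print(np.allclose(A @ n, 0))        # True: n is in the null space

# n is orthogonal to every row of A, hence to all of R(A)
for row in A:
    print(np.isclose(row @ n, 0))   # True, True
```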