Orthonormalization
Gram-Schmidt orthonormalization is a process for converting a set of linearly independent vectors into orthonormal vectors.
Vectors
Two vectors a and b can be orthonormalized into A and B.
Orthogonality is a property of two vectors, not one. Therefore a needs no transformation and becomes A.
The process of transforming b into B is simply the subtraction of the component of a from b. This is a linear combination and does not change the column space of a system that includes both a and b. Projections are a complementary idea; p is the component along a that estimates b. The process of orthonormalization is the same as computing a projection, but the error term e is the desired result. Recall that e = b − Ax̂ and x̂ = (Aᵀb)/(AᵀA). Therefore, B = b − A (Aᵀb)/(AᵀA).
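The subtraction above can be sketched numerically with NumPy. The vectors a and b here are hypothetical example values; any linearly independent pair works.

```python
import numpy as np

# Hypothetical example vectors (any linearly independent pair works).
a = np.array([1.0, 1.0, 0.0])
b = np.array([1.0, 0.0, 1.0])

A = a                             # the first vector needs no transformation
B = b - A * (A @ b) / (A @ A)     # subtract the component of b along A

print(A @ B)  # ~0: A and B are orthogonal
```

The inner product A·B comes out (numerically) zero, confirming that B is the error term e left over after projecting b onto A.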
To transform another vector c into being orthogonal to both A and B, apply the same process for each component: C = c − A (Aᵀc)/(AᵀA) − B (Bᵀc)/(BᵀB).
The orthogonal vectors are then normalized by dividing each by its Euclidean length, as A/||A|| and B/||B||.
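The full process for three vectors, including the normalization step, can be sketched as follows. The vectors a, b, and c are hypothetical example values.

```python
import numpy as np

# Hypothetical example vectors.
a = np.array([1.0, 1.0, 0.0])
b = np.array([1.0, 0.0, 1.0])
c = np.array([0.0, 1.0, 1.0])

# Orthogonalize: subtract the components along each earlier vector.
A = a
B = b - A * (A @ b) / (A @ A)
C = c - A * (A @ c) / (A @ A) - B * (B @ c) / (B @ B)

# Normalize: divide each vector by its Euclidean length.
qa = A / np.linalg.norm(A)
qb = B / np.linalg.norm(B)
qc = C / np.linalg.norm(C)

print(qa @ qb, qa @ qc, qb @ qc)  # all ~0: pairwise orthogonal
```

After normalization each vector has unit length, so qa, qb, and qc form an orthonormal set.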
Matrices
The process applied to vectors is also applicable to the columns of a matrix. Instead of vectors a and b, use v₁ and v₂ in V. The process yields u₁ and u₂ in U. Then the columns are normalized into Q like q₁ = u₁/||u₁||.
To re-emphasize, this is a linear combination generalized as A = QR, and does not change the column space of A.
Note that Q is a matrix with orthonormal columns; it must also be square to be called an orthogonal matrix.
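Both points can be checked with NumPy's built-in QR factorization. The matrix V below is a hypothetical tall (non-square) example, so its Q has orthonormal columns but is not an orthogonal matrix.

```python
import numpy as np

# Hypothetical tall matrix: 3 rows, 2 columns.
V = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

Q, R = np.linalg.qr(V)  # reduced QR: Q is 3x2, R is 2x2

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q has orthonormal columns
print(np.allclose(Q @ R, V))            # True: V = QR, same column space
```

Since Q here is 3×2 rather than square, QᵀQ = I but QQᵀ ≠ I, which is why it is called a matrix with orthonormal columns rather than an orthogonal matrix.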
