Projection
A projection is the best approximation of a vector within a column space.
Vectors
Given two vectors a and b, b can be projected into C(a), the column space of a.
Furthermore, the projection p with the least error relative to the true vector b is characterized by orthogonality. Let e = b - p be the error vector; it is orthogonal to a.
The transformation of vector b into projection vector p can be described by a projection matrix. It is notated P as in p = Pb.
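A minimal sketch of this projection in NumPy, using hypothetical example vectors (not from the article); the multiple of a is found from the orthogonality condition a · e = 0:

```python
import numpy as np

# Hypothetical example vectors.
a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 1.0, 0.0])

# Projection of b onto the line through a: p = a * (a.b)/(a.a).
x = (a @ b) / (a @ a)
p = x * a

# The error vector is orthogonal to a.
e = b - p
print(np.isclose(a @ e, 0.0))  # True
```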
Properties
The projection matrix P satisfying p = Pb has rank 1.
C(P), the column space of the projection matrix, is equivalent to C(a), the line through a.
Projection matrices are symmetric (i.e., P^T = P) and idempotent (i.e., P^2 = P).
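These properties can be checked numerically. A sketch with a hypothetical column vector a, building the rank-1 projection matrix P = aa^T / (a^T a):

```python
import numpy as np

a = np.array([[1.0], [2.0], [2.0]])  # hypothetical column vector

# Rank-1 projection matrix onto the line through a.
P = (a @ a.T) / (a.T @ a)

print(np.linalg.matrix_rank(P))  # 1
print(np.allclose(P.T, P))       # True: symmetric
print(np.allclose(P @ P, P))     # True: idempotent
```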
Matrices
For all the same reasons, a vector b can be projected into C(A), the column space of A. The error vector e is orthogonal to C(A) and is therefore in the null space of A^T (the left null space).
The projection matrix is now notated P as in p = Pb.
Least Squares
Given a consistent system Ax = b, i.e., b is in C(A), there is at least one solution for x.
If the system is inconsistent, then there is no solution. The best approximation is expressed as Ax̂ = p where projection p estimates b with an error term e. This should sound familiar.
The error term can be generally characterized by e = b - p. An expression for p is known, so e = b - Ax̂.
e is orthogonal to C(A), so A^T e = 0. Substituting the above expression gives A^T (b - Ax̂) = 0.
Altogether, A^T A x̂ = A^T b, which, assuming A has full column rank so that A^T A is invertible, solves to x̂ = (A^T A)^-1 A^T b.
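A sketch of solving the normal equations on a hypothetical overdetermined system (3 equations, 2 unknowns), checked against NumPy's built-in least-squares routine:

```python
import numpy as np

# Hypothetical overdetermined system: 3 equations, 2 unknowns.
A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equations: A^T A x̂ = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Agrees with NumPy's least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_lstsq))  # True
```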
The projection matrix is calculated as P = A (A^T A)^-1 A^T.
The projection is calculated as p = A (A^T A)^-1 A^T b.
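Continuing the sketch with the same hypothetical A and b, the projection matrix can be formed explicitly and the error vector checked against every column of A:

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # hypothetical full-column-rank matrix
b = np.array([6.0, 0.0, 0.0])

# P = A (A^T A)^-1 A^T, then p = Pb and e = b - p.
P = A @ np.linalg.inv(A.T @ A) @ A.T
p = P @ b
e = b - p

# e is orthogonal to every column of A, i.e. A^T e = 0.
print(np.allclose(A.T @ e, 0.0))  # True
```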
Properties
Projection matrices are still symmetric and idempotent.
If b is in C(A), then Pb = b. Conversely, if b is orthogonal to C(A), then Pb = 0 and e = b.
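Both special cases can be verified directly. A sketch with hypothetical vectors: one built as a combination of the columns of A (so it lies in C(A)), and one chosen orthogonal to both columns:

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # hypothetical matrix
P = A @ np.linalg.inv(A.T @ A) @ A.T

b_in = A @ np.array([2.0, -1.0])    # lies in C(A)
print(np.allclose(P @ b_in, b_in))  # True: projection leaves it unchanged

b_perp = np.array([1.0, -2.0, 1.0])  # orthogonal to both columns of A
print(np.allclose(P @ b_perp, 0.0))  # True: projects to zero, so e = b
```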
