Mahalanobis Distance
Mahalanobis distance is a Euclidean distance that is transformed through a change of basis to normalize variance.
Description
Mahalanobis distance is equivalent to Euclidean distance with a change in basis.
Squared Euclidean distance is commonly formulated as...
given a vector x⃗ and the origin as a reference point, x⃗ᵀx⃗.
given two vectors x⃗ and y⃗, (x⃗-y⃗)ᵀ(x⃗-y⃗).
Let z⃗ = x⃗ - y⃗, so (x⃗-y⃗)ᵀ(x⃗-y⃗) = z⃗ᵀz⃗.
given a column x and a column of population means as μ, (x-μ)ᵀ(x-μ).
Never forget to take the square root!
Note that this is equivalent to x⃗ᵀIx⃗. A change of basis can be effected by swapping the identity matrix with some other A⁻¹ (so notated because the motivation is generally that there is some other linear transformation A that pre-exists and needs to be undone).
The squared Mahalanobis distance is then calculated as...
given a vector x⃗ and the origin as a reference point, x⃗ᵀA⁻¹x⃗.
given two vectors x⃗ and y⃗, (x⃗-y⃗)ᵀA⁻¹(x⃗-y⃗).
Let z⃗ = x⃗ - y⃗, so (x⃗-y⃗)ᵀA⁻¹(x⃗-y⃗) = z⃗ᵀA⁻¹z⃗.
given a column x and a column of population means as μ, (x-μ)ᵀA⁻¹(x-μ).
Again, never forget to take the square root!
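A minimal sketch of the calculation above in Python with NumPy (the vectors are arbitrary examples; with A = I the calculation reduces to squared Euclidean distance):

```python
import numpy as np

def sq_mahalanobis(x, y, A_inv):
    """Squared Mahalanobis distance (x - y)^T A^-1 (x - y).

    A_inv is the inverse of the change-of-basis matrix A
    (in the statistical case, the inverse covariance matrix).
    """
    z = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(z @ A_inv @ z)

# With A = I this is just squared Euclidean distance.
x = np.array([3.0, 4.0])
origin = np.zeros(2)
d2 = sq_mahalanobis(x, origin, np.eye(2))   # 3^2 + 4^2 = 25.0
d = np.sqrt(d2)                             # never forget the square root: 5.0
```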
Properties
Mahalanobis distance is invariant under non-singular linear transformations. Let Y₁ = a + bX₁ and Y₂ = a + bX₂, and suppose that b is non-singular. Then d_M(Y₁,Y₂) = d_M(X₁,X₂).
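A quick numeric check of this invariance (a sketch in Python with NumPy; the points, the covariance matrix S, and the affine map are made up for illustration — note that under Y = a + bX the covariance transforms as bSbᵀ):

```python
import numpy as np

def mahalanobis(u, v, VI):
    # sqrt((u - v)^T VI (u - v)), with VI the inverse covariance matrix
    z = u - v
    return float(np.sqrt(z @ VI @ z))

# Hypothetical points and a positive-definite covariance S.
x1 = np.array([1.0, 2.0])
x2 = np.array([4.0, 0.5])
S = np.array([[2.0, 0.3], [0.3, 1.0]])

# Non-singular affine map Y = a + bX; the covariance becomes b S b^T.
a = np.array([5.0, -1.0])
b = np.array([[1.0, 2.0], [0.0, 3.0]])   # det = 3, so non-singular
y1, y2 = a + b @ x1, a + b @ x2

d_x = mahalanobis(x1, x2, np.linalg.inv(S))
d_y = mahalanobis(y1, y2, np.linalg.inv(b @ S @ b.T))
# d_x and d_y agree up to floating-point error
```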
Geometry
In a two-dimensional graph, plotting the points with a Euclidean distance of 1 around the origin results in a unit circle. The change of basis described by A transforms the circle into an ellipse.
Note that if A is diagonal, the ellipse will be axis-aligned (i.e., appear to be stretched along the x or y axes).
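This geometry can be verified numerically (a sketch in Python with NumPy; the covariance matrix is an arbitrary diagonal example). Mapping the unit circle through L, where Σ = LLᵀ is the Cholesky factorization, yields exactly the points at Mahalanobis distance 1, since xᵀΣ⁻¹x = uᵀu = 1 for x = Lu:

```python
import numpy as np

Sigma = np.array([[4.0, 0.0], [0.0, 1.0]])   # diagonal: axis-aligned ellipse
L = np.linalg.cholesky(Sigma)                # Sigma = L @ L.T
Sigma_inv = np.linalg.inv(Sigma)

theta = np.linspace(0.0, 2.0 * np.pi, 100)
circle = np.stack([np.cos(theta), np.sin(theta)])   # unit circle, shape (2, 100)
ellipse = L @ circle                                # stretched by 2 along x

# Squared Mahalanobis distance of every ellipse point from the origin.
d2 = np.einsum('ij,ik,kj->j', ellipse, Sigma_inv, ellipse)
# d2 is 1.0 everywhere: the ellipse is the Mahalanobis "unit circle"
```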
Usage
Mahalanobis distances are appropriate for calculating variance-normalized distance under a multivariate distribution, as for test statistics. The change of basis is established by the inverse covariance matrix, notated as Σ⁻¹.
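As a sketch of the statistical usage (Python with NumPy; the distribution parameters and query point are invented for illustration), the inverse of an estimated covariance matrix is plugged into the formula to measure how far a point lies from the sample mean in variance-normalized terms:

```python
import numpy as np

# Simulated correlated data standing in for a real sample.
rng = np.random.default_rng(42)
data = rng.multivariate_normal(mean=[0.0, 0.0],
                               cov=[[2.0, 0.8], [0.8, 1.0]], size=500)

mu = data.mean(axis=0)                               # sample mean
Sigma_inv = np.linalg.inv(np.cov(data, rowvar=False))  # inverse sample covariance

x = np.array([1.5, -0.5])
z = x - mu
d = np.sqrt(z @ Sigma_inv @ z)   # Mahalanobis distance of x from the sample
```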
Normalized Euclidean distance
Using a diagonal matrix of variance terms ignores correlations between the terms, which is effectively an assumption of independence. Although this is not the true Mahalanobis distance, the calculation still has some utility.
The mahascore documentation calls this metric 'normalized Euclidean distance'.
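The diagonal-only variant reduces to dividing each squared difference by that dimension's variance (a sketch in Python with NumPy; the vectors and variances are arbitrary examples):

```python
import numpy as np

def normalized_euclidean(x, y, variances):
    """Distance using only per-dimension variances (the diagonal of Sigma).

    Equivalent to Mahalanobis distance under an independence assumption;
    off-diagonal covariance terms are ignored.
    """
    z = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(np.sum(z**2 / np.asarray(variances, dtype=float))))

x = np.array([2.0, 3.0])
y = np.array([0.0, 0.0])
variances = np.array([4.0, 9.0])
d = normalized_euclidean(x, y, variances)   # sqrt(4/4 + 9/9) = sqrt(2)
```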
