
Covariance

Covariance is a measure of how much one variable varies with another. It is a generalization of variance: Var(X) = Cov(X,X).


Description

Covariance is calculated as:

Cov(X,Y) = E[(X - E[X])(Y - E[Y])]

Covariance is related to correlation as:

Corr(X,Y) = Cov(X,Y) / (σ_X σ_Y)
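The definition and the correlation scaling can be spot-checked with a short pure-Python sketch (the sample lists are hypothetical; population formulas are used throughout):

```python
# Population covariance and correlation from paired samples.
from statistics import mean, pstdev

def cov(xs, ys):
    # Cov(X,Y) = E[(X - E[X])(Y - E[Y])]
    mx, my = mean(xs), mean(ys)
    return mean((x - mx) * (y - my) for x, y in zip(xs, ys))

def corr(xs, ys):
    # Corr(X,Y) = Cov(X,Y) / (σ_X σ_Y)
    return cov(xs, ys) / (pstdev(xs) * pstdev(ys))

xs = [1.0, 2.0, 3.0, 4.0]  # hypothetical sample
ys = [2.0, 4.0, 6.0, 8.0]  # ys = 2 * xs, so correlation is 1
print(cov(xs, ys))   # 2.5
print(cov(xs, xs))   # 1.25 — equals Var(X), since Var(X) = Cov(X,X)
print(corr(xs, ys))  # ≈ 1.0
```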

Letting X̅ be the mean of X and Y̅ be the mean of Y, the calculation becomes:

Cov(X,Y) = E[(X - X̅)(Y - Y̅)]

E[XY - X̅Y - XY̅ + X̅Y̅]

E[XY] - X̅E[Y] - E[X]Y̅ + X̅Y̅

E[XY] - X̅Y̅ - X̅Y̅ + X̅Y̅

E[XY] - X̅Y̅

This gives a trivial proof that independent variables have zero covariance and zero correlation: independence implies E[XY] = E[X]E[Y], so E[XY] - X̅Y̅ = 0.
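The shortcut Cov(X,Y) = E[XY] - E[X]E[Y] and the zero-covariance result can be checked by constructing an exactly independent pair, i.e. enumerating every equally likely combination of two values (the binary values chosen are hypothetical):

```python
# Enumerate the joint distribution of two independent binary variables.
from itertools import product
from statistics import mean

pairs = list(product([0, 1], [0, 1]))  # all 4 equally likely (x, y) pairs
xs = [x for x, _ in pairs]
ys = [y for _, y in pairs]

e_xy = mean(x * y for x, y in pairs)   # E[XY]
shortcut = e_xy - mean(xs) * mean(ys)  # E[XY] - E[X]E[Y]
print(shortcut)  # 0.0 — independence gives zero covariance
```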

In the context of linear algebra, the calculation is notated as:

Cov(X,Y) = E[(X - E[X])(Y - E[Y])^T]

Letting m_X be the mean vector of X and m_Y be the mean vector of Y, the calculation becomes:

Cov(X,Y) = E[(X - m_X)(Y - m_Y)^T]
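In code, this is an average of outer products of the centered vectors. A minimal pure-Python sketch (the data and the helper names mean_vec and cross_cov are illustrative):

```python
from statistics import mean

def mean_vec(rows):
    # Componentwise mean of a list of vectors
    return [mean(col) for col in zip(*rows)]

def cross_cov(X, Y):
    # Cov(X,Y) = E[(X - m_X)(Y - m_Y)^T], averaged over paired rows
    mX, mY = mean_vec(X), mean_vec(Y)
    n = len(X)
    S = [[0.0] * len(mY) for _ in mX]
    for x, y in zip(X, Y):
        for i in range(len(mX)):
            for j in range(len(mY)):
                S[i][j] += (x[i] - mX[i]) * (y[j] - mY[j]) / n
    return S

X = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]]  # hypothetical observations
S = cross_cov(X, X)  # Cov(X,X) is the covariance matrix of X
print(S)
```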

Properties

Covariance is symmetric: Cov(X,Y) = Cov(Y,X)


Transformations

Covariance scales linearly with scalar multiplication of an input.

Cov(aX,Y) = E[aXY] - E[aX]E[Y]

a E[XY] - a E[X]E[Y]

a (E[XY] - E[X]E[Y])

a Cov(X,Y)

Covariance is additive in its inputs.

Cov(X+Y,Z) = E[(X+Y)Z] - E[X+Y]E[Z]

E[XZ+YZ] - E[X+Y]E[Z]

(E[XZ] + E[YZ]) - (E[X] + E[Y]) E[Z]

(E[XZ] + E[YZ]) - (E[X]E[Z] + E[Y]E[Z])

(E[XZ] - E[X]E[Z]) + (E[YZ] - E[Y]E[Z])

Cov(X,Z) + Cov(Y,Z)

This gives a trivial proof that constant additions cancel out.

Cov(a+X,Y) = Cov(X,Y) + Cov(a,Y) = Cov(X,Y) + 0

Altogether: Cov(a+bX,c+dY) = b d Cov(X,Y)
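The affine rule can be spot-checked numerically (the samples and constants are hypothetical):

```python
from statistics import mean

def cov(xs, ys):
    # Population covariance of paired samples
    mx, my = mean(xs), mean(ys)
    return mean((x - mx) * (y - my) for x, y in zip(xs, ys))

xs = [1.0, 3.0, 5.0, 7.0]
ys = [2.0, 1.0, 4.0, 3.0]
a, b, c, d = 10.0, 2.0, -5.0, 3.0

lhs = cov([a + b * x for x in xs], [c + d * y for y in ys])
rhs = b * d * cov(xs, ys)
print(lhs, rhs)  # equal: the shifts a, c cancel and b, d factor out
```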


Matrix

A covariance matrix describes multivariate covariances. Cell (i,j) is the covariance of the ith variable with the jth variable. On the diagonal are variances (i.e., covariance of a variable with itself). The matrix is usually notated as Σ.

The inverse covariance matrix, Σ^-1, is also called the precision matrix.
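For a 2x2 covariance matrix, the precision matrix can be computed with the closed-form 2x2 inverse (the numbers below are hypothetical):

```python
# Invert a 2x2 covariance matrix via the adjugate formula.
S = [[2.0, 0.5],
     [0.5, 1.0]]
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
P = [[ S[1][1] / det, -S[0][1] / det],   # precision matrix Σ^-1
     [-S[1][0] / det,  S[0][0] / det]]

# Σ Σ^-1 should be the identity
I2 = [[sum(S[i][k] * P[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
print(I2)
```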

Properties

A covariance matrix is necessarily square, symmetric, and positive semi-definite.

  • Σ = Σ^T

  • the determinant is bounded by |Σ| >= 0

  • Σ^0.5 (a matrix square root) can always be evaluated
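These properties can be spot-checked on a small example (hypothetical numbers; for a 2x2 symmetric matrix the eigenvalues have a closed form, and nonnegative eigenvalues are equivalent to positive semi-definiteness):

```python
import math

S = [[2.0, 0.5],
     [0.5, 1.0]]

symmetric = S[0][1] == S[1][0]               # Σ = Σ^T
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]  # |Σ|
tr = S[0][0] + S[1][1]
disc = math.sqrt(tr * tr - 4 * det)
eigs = [(tr - disc) / 2, (tr + disc) / 2]    # both >= 0 for PSD
print(symmetric, det >= 0, min(eigs) >= 0)
```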

Linear Algebra

The covariance matrix transforms linearly under linear transformations of the inputs.

Cov(AX,AY) = E[(AX - A m_X)(AY - A m_Y)^T]

E[A(X - m_X)(Y - m_Y)^T A^T]

A E[(X - m_X)(Y - m_Y)^T] A^T

A Σ A^T
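The A Σ A^T identity can be checked numerically: transform the data by A, recompute the covariance matrix, and compare (the data and A are hypothetical):

```python
from statistics import mean

def cov_matrix(rows):
    # Population covariance matrix of a list of observation vectors
    m = [mean(col) for col in zip(*rows)]
    n, p = len(rows), len(m)
    return [[sum((r[i] - m[i]) * (r[j] - m[j]) for r in rows) / n
             for j in range(p)] for i in range(p)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

X = [[1.0, 2.0], [2.0, 1.0], [3.0, 3.0], [0.0, 2.0]]
A = [[1.0, 1.0], [0.0, 2.0]]

AX = [[sum(A[i][k] * x[k] for k in range(2)) for i in range(2)] for x in X]
lhs = cov_matrix(AX)                         # Cov(AX, AX)
At = [list(col) for col in zip(*A)]
rhs = matmul(matmul(A, cov_matrix(X)), At)   # A Σ A^T
print(lhs, rhs)  # element-wise equal
```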

Trivially, if the transformation is a scalar matrix aI:

(aI) Σ (aI)^T

a Σ a

a^2 Σ


CategoryRicottone

Statistics/Covariance (last edited 2025-11-03 01:25:49 by DominicRicottone)