Differences between revisions 1 and 5 (spanning 4 versions)
Revision 1 as of 2024-06-05 22:43:56
Size: 1599
Comment: Partial
Revision 5 as of 2024-06-05 23:20:26
Size: 816
Comment: Simplify
Deletions (from revision 1) are marked with a leading '-'. Additions (in revision 5) are marked with a leading '+'.
Line 1: Line 1:
- = Ordinary Least Squares Univariate Proof =
+ ## page was renamed from Econometrics/OrdinaryLeastSquares/MultivariateProof
+ = Ordinary Least Squares Multivariate Proof =
Line 7: Line 8:
- The model is fit by a minimization problem:
+ where:
Line 9: Line 10:
- {{attachment:min.svg}}
-
- ----
+  * ''y'' and ''ε'' are vectors of size ''n''
+  * ''β'' is a vector of size ''p''
+  * '''''X''''' is a matrix of shape ''n'' by ''p''
Line 17: Line 18:
- This line must pass through the mean and the slope of the line must be the marginal change in ''Y'' given a unit change in ''X''. In other words, the line must pass through two points:
+ with the constraint:
Line 21: Line 22:
- where:
+ The [[Calculus/PartialDerivatives|partial derivative]] of this constraint with respect to ''b'' is calculated like:
Line 23: Line 24:
-  * ''X‾'' is the sample mean of ''X'' (estimating ''μ,,X,,'')
-  * ''Y‾'' is the sample mean of ''Y'' (estimating ''μ,,Y,,'')
-  * ''s,,X,,'' is the sample standard deviation of ''X'' (estimating ''σ,,X,,'')
-  * ''s,,Y,,'' is the sample standard deviation of ''Y'' (estimating ''σ,,Y,,'')
-  * and ''r,,XY,,'' is the sample correlation coefficient between ''X'' and ''Y'' (estimating ''ρ,,XY,,'')
+ {{attachment:b1.svg}}
Line 29: Line 26:
- Insert the first point into the estimation. This is quickly solved for ''α''.
+ {{attachment:b2.svg}}
Line 31: Line 28:
- {{attachment:alpha1.svg}}
+ Set this derivative to 0 to find the minimum with respect to ''b''.
Line 33: Line 30:
- {{attachment:alpha2.svg}}
+ {{attachment:b3.svg}}
Line 35: Line 32:
- Insert the second point and the solution for ''α'' into the estimation.
+ {{attachment:b4.svg}}
Line 37: Line 34:
- {{attachment:beta1.svg}}
+ {{attachment:b5.svg}}
Line 39: Line 36:
- {{attachment:beta2.svg}}
+ {{attachment:b6.svg}}
Line 41: Line 38:
- {{attachment:beta3.svg}}
-
- This reduced form can be quickly solved for ''β''.
-
- {{attachment:beta4.svg}}
-
- Because the correlation coefficient can be expressed in terms of covariance and standard deviations...
-
- {{attachment:correlation.svg}}
-
- ...the solution for ''β'' can be further reduced.
-
- {{attachment:beta5.svg}}
-
- Therefore, the regression line is estimated to be:
-
- {{attachment:regression.svg}}

Ordinary Least Squares Multivariate Proof

The model is constructed like:

model1.svg

where:

  • y and ε are vectors of size n

  • β is a vector of size p

  • X is a matrix of shape n by p
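The attachment model1.svg is not reproduced in this text copy. Presumably it shows the model in the standard matrix form; as a LaTeX sketch using the symbols defined above:

\[
  y = X\beta + \varepsilon ,
  \qquad y,\, \varepsilon \in \mathbb{R}^{n} ,
  \quad \beta \in \mathbb{R}^{p} ,
  \quad X \in \mathbb{R}^{n \times p}
\]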

This is estimated as:

model2.svg

with the constraint:

model3.svg
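The attachments model2.svg and model3.svg are likewise not rendered here. The "constraint" is the least-squares minimization criterion; writing b for the estimated coefficient vector and e for the residual vector, a sketch of what these presumably show is:

\[
  \hat{y} = Xb , \qquad e = y - Xb ,
\]
\[
  \min_{b}\; e'e \;=\; \min_{b}\; (y - Xb)'(y - Xb)
\]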

The partial derivative of this constraint with respect to b is calculated like:

b1.svg

b2.svg
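Assuming b1.svg and b2.svg expand the sum of squared residuals and differentiate it term by term, the standard algebra is (a sketch, not necessarily the exact form in the attachments):

\[
  S(b) = (y - Xb)'(y - Xb) = y'y - 2b'X'y + b'X'Xb
\]
\[
  \frac{\partial S}{\partial b} = -2X'y + 2X'Xb
\]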

Set this derivative to 0 to find the minimum with respect to b.

b3.svg

b4.svg

b5.svg

b6.svg
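Presumably b3.svg through b6.svg set the derivative to zero and rearrange into the normal equations; a sketch of that algebra:

\[
  -2X'y + 2X'Xb = 0
  \quad\Longrightarrow\quad
  X'Xb = X'y
  \quad\Longrightarrow\quad
  b = (X'X)^{-1}X'y
\]

This requires X'X to be invertible, i.e. X to have full column rank. The second derivative, 2X'X, is positive semi-definite, so the stationary point is a minimum.

A quick numerical check of the closed form against a library least-squares solver (a minimal Python sketch, not part of the original page; the data is simulated purely for illustration):

import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))           # design matrix, shape n by p
beta = np.array([1.5, -2.0, 0.5])     # coefficients used to simulate y
y = X @ beta + rng.normal(size=n)     # y = X beta + noise

# Closed-form OLS solution: b = (X'X)^{-1} X'y
b_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Reference solution from numpy's least-squares solver
b_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(b_closed, b_lstsq))  # expected: True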


CategoryRicottone
