
Ordinary Least Squares

Ordinary Least Squares (OLS) is a linear regression method. It estimates the coefficients of a linear model by minimizing the sum of squared residuals (equivalently, the root mean squared error).


Univariate

The regression line passes through two points:

\[ (\bar{x},\ \bar{y}) \]

and

\[ (\bar{x} + s_x,\ \bar{y} + r s_y) \]

These points, together with the generic equation for a line, prove that the slope of the regression line is equal to:

\[ \hat{\beta}_1 = r \frac{s_y}{s_x} = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2} \]

The generic formula for the regression line is:

\[ \hat{y} = \bar{y} + \hat{\beta}_1 (x - \bar{x}) \]
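The univariate computation can be sketched in NumPy; the data values here are purely illustrative:

```python
import numpy as np

# Toy data, chosen only to illustrate the computation.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()

# Slope: covariance of x and y over variance of x,
# equivalently r * (s_y / s_x).
beta1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)

# The line passes through the point of means, which pins down the intercept.
beta0 = y_bar - beta1 * x_bar

print(beta1, beta0)
```

The result agrees with `np.polyfit(x, y, 1)`, which fits the same least-squares line.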


Multivariate


Linear Model

The linear model can be expressed as:

\[ y_i = \beta_0 + \beta_1 x_{i1} + \dots + \beta_k x_{ik} + u_i \]

If these assumptions can be made:

  1. Linearity
  2. Exogeneity
  3. Random sampling
  4. No perfect multicollinearity
  5. Homoskedasticity

Then OLS is the best linear unbiased estimator (BLUE) for these coefficients.

Using the computation above, the coefficients are estimated to produce the fitted model:

\[ \hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_{i1} + \dots + \hat{\beta}_k x_{ik} \]
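The estimation can be sketched with the normal equations, \( \hat{\beta} = (X'X)^{-1} X'y \). This is a minimal NumPy illustration on simulated data; the coefficients and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 2

# Design matrix with an intercept column; true coefficients chosen arbitrarily.
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Solve the normal equations X'X beta = X'y rather than inverting X'X directly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)
```

The estimates match `np.linalg.lstsq`, which solves the same least-squares problem more robustly.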

The variances for each coefficient are:

\[ \mathrm{Var}(\hat{\beta}_j) = \frac{\sigma^2}{SST_j (1 - R_j^2)} \]

where \( SST_j = \sum_i (x_{ij} - \bar{x}_j)^2 \) and \( R_j^2 \) is the R-squared from regressing \( x_j \) on the other regressors.

Note that the variance of the population errors is unknown, so it is estimated from the residuals with a degrees-of-freedom correction:

\[ \hat{\sigma}^2 = \frac{1}{n - k - 1} \sum_{i=1}^{n} \hat{u}_i^2 \]
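Under homoskedasticity, these pieces combine into the classical variance matrix \( \hat{\sigma}^2 (X'X)^{-1} \). A minimal sketch on simulated data (seed and coefficients are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)  # errors have variance 1

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

# Degrees-of-freedom-corrected estimate of the error variance.
k = X.shape[1] - 1                      # regressors, excluding the intercept
sigma2_hat = resid @ resid / (n - k - 1)

# Classical (homoskedastic) variance matrix: sigma^2 (X'X)^{-1}.
var_beta = sigma2_hat * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(var_beta))
print(sigma2_hat, se)
```

With unit-variance errors, `sigma2_hat` should land near 1 and the standard errors shrink at roughly the \( 1/\sqrt{n} \) rate.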

If the homoskedasticity assumption does not hold, the variance formula above no longer applies. Each estimator can be written as:

\[ \hat{\beta}_j = \beta_j + \frac{\sum_i \hat{r}_{ij} u_i}{\sum_i \hat{r}_{ij}^2} \]

where, for example, \( \hat{r}_{i1} \) is the residual for observation \( i \) from regressing \( x_1 \) onto \( x_2, \ldots, x_k \).
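The partialling-out residuals are not just a derivation device: regressing y on them alone recovers the full-model coefficient (the Frisch-Waugh-Lovell result). A small simulated check, with arbitrary coefficients and seed:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# Full multivariate OLS fit.
X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Residualize x1 on the remaining regressors (a constant and x2).
Z = np.column_stack([np.ones(n), x2])
r1 = x1 - Z @ np.linalg.solve(Z.T @ Z, Z.T @ x1)

# The full-model coefficient on x1 equals the simple regression
# of y on the partialled-out residual r1.
b1_partial = (r1 @ y) / (r1 @ r1)
print(beta_hat[1], b1_partial)
```

The two numbers agree to machine precision, since the identity is exact algebra rather than an approximation.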

The variances for each coefficient can be estimated with the Eicker-White formula:

\[ \widehat{\mathrm{Var}}(\hat{\beta}_j) = \frac{\sum_i \hat{r}_{ij}^2 \hat{u}_i^2}{\left( \sum_i \hat{r}_{ij}^2 \right)^2} \]
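This residual-based formula is algebraically identical to the usual sandwich form \( (X'X)^{-1} X' \operatorname{diag}(\hat{u}_i^2) X (X'X)^{-1} \), which the following sketch verifies on simulated heteroskedastic data (seed and coefficients are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# Heteroskedastic errors: the variance grows with |x1|.
u = rng.normal(size=n) * (0.5 + np.abs(x1))
y = 1.0 + 2.0 * x1 - 1.0 * x2 + u

X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
uhat = y - X @ beta_hat

# Partial out the other regressors from x1: r1 is the residual from
# regressing x1 on a constant and x2.
Z = np.column_stack([np.ones(n), x2])
r1 = x1 - Z @ np.linalg.solve(Z.T @ Z, Z.T @ x1)

# Eicker-White robust variance for beta_1, residual form.
var_b1_white = np.sum(r1**2 * uhat**2) / np.sum(r1**2) ** 2

# Sandwich form for comparison: (X'X)^{-1} X' diag(uhat^2) X (X'X)^{-1}.
XtX_inv = np.linalg.inv(X.T @ X)
sandwich = XtX_inv @ (X.T * uhat**2) @ X @ XtX_inv
print(var_b1_white, sandwich[1, 1])
```

Both expressions give the same robust variance for \( \hat{\beta}_1 \); the residual form simply isolates one diagonal element of the sandwich.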

See [[https://www.youtube.com/@kuminoff|Nicolai Kuminoff's]] video lectures for the derivation of the robust estimators.


CategoryRicottone

Statistics/OrdinaryLeastSquares (last edited 2025-01-10 14:33:38 by DominicRicottone)