Ordinary Least Squares

Ordinary Least Squares (OLS) is a linear regression method. It minimizes the sum of squared residuals.
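As a quick check of the minimization property, the following NumPy sketch fits OLS on simulated data and confirms that perturbing the fitted coefficients only increases the sum of squared residuals (the data, seed, and true coefficients are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
y = X @ np.array([1.0, -0.5]) + rng.normal(size=n)     # assumed true coefficients

# OLS fit via least squares.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

def ssr(b):
    """Sum of squared residuals at coefficient vector b."""
    r = y - X @ b
    return r @ r

# Any perturbation of the fitted coefficients weakly increases the SSR.
perturbed = beta_hat + rng.normal(scale=0.1, size=(20, 2))
assert all(ssr(beta_hat) <= ssr(b) for b in perturbed)
```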


Univariate

The regression line passes through two points:

(x̄, ȳ)

and

(x̄ + s_x, ȳ + r s_y)

where x̄ and ȳ are the sample means, s_x and s_y are the sample standard deviations, and r is the correlation coefficient. These points, with the generic equation for a line, prove that the slope of the regression line is equal to:

β₁ = r (s_y / s_x) = Σ (xᵢ - x̄)(yᵢ - ȳ) / Σ (xᵢ - x̄)²

The generic formula for the regression line is:

ŷ = β₀ + β₁ x, where β₀ = ȳ - β₁ x̄
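The univariate slope and intercept can be computed directly from deviations from the means; a minimal NumPy sketch (the sample data are made up for illustration):

```python
import numpy as np

# Hypothetical sample data, assumed for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.0, 5.0, 4.0, 5.0])

# Slope: sum of cross-deviations over sum of squared x-deviations.
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

# Intercept: the regression line passes through the point of means.
intercept = y.mean() - slope * x.mean()

# slope ≈ 0.6, intercept ≈ 2.2 for this sample
```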


Linear Model

The linear model can be expressed as:

{{attachment:model1.svg}}

If these assumptions can be made:

  1. Linearity
  2. Exogeneity
  3. Random sampling
  4. No perfect multicollinearity
  5. Homoskedasticity

Then OLS is the best linear unbiased estimator (BLUE) for these coefficients.

Applying the computation above, the estimated coefficients are:

{{attachment:model2.svg}}
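In matrix form, the OLS coefficient estimate is β̂ = (X'X)⁻¹X'y. A minimal NumPy sketch on simulated data (the design, seed, and true coefficients are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
beta = np.array([1.0, 2.0])                            # assumed true coefficients
y = X @ beta + rng.normal(size=n)

# beta_hat = (X'X)^{-1} X'y, computed via a linear solve for numerical stability.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

With enough observations the estimate lands close to the true coefficients, since OLS is unbiased under the assumptions above.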

The variances for each coefficient are:

{{attachment:homo1.svg}}

Note that the standard deviation of the population errors is unknown, so it is estimated as:

{{attachment:homo2.svg}}
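Under homoskedasticity, the standard recipe is to estimate the error variance with a degrees-of-freedom correction, σ̂² = e'e / (n − k), and then take Var(β̂) = σ̂²(X'X)⁻¹. A sketch on simulated data (all values are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 500, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([0.5, 1.5]) + rng.normal(size=n)  # homoskedastic errors

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

# Estimated error variance with degrees-of-freedom correction n - k.
sigma2_hat = resid @ resid / (n - k)

# Homoskedastic covariance matrix and standard errors.
cov = sigma2_hat * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(cov))
```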

If the homoskedasticity assumption does not hold, then the estimators for each coefficient are actually:

{{attachment:hetero1.svg}}

It follows that the variances for each coefficient are:

{{attachment:hetero2.svg}}

These variances can be estimated with the Eicker-White formula:

Var(β̂) = (X'X)⁻¹ (Σᵢ êᵢ² xᵢ xᵢ') (X'X)⁻¹
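The Eicker-White (heteroskedasticity-consistent) estimator is a sandwich: the usual (X'X)⁻¹ as the bread, and Σᵢ êᵢ² xᵢxᵢ' as the meat. A sketch on simulated heteroskedastic data (the error process here is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
# Assumed heteroskedastic errors: spread grows with |x|.
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n) * (1 + np.abs(x))

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ beta_hat

XtX_inv = np.linalg.inv(X.T @ X)
# Meat: sum_i e_i^2 x_i x_i', computed by scaling each row of X by e_i^2.
meat = X.T @ (X * (e ** 2)[:, None])
# Sandwich: (X'X)^{-1} meat (X'X)^{-1}
cov_hc0 = XtX_inv @ meat @ XtX_inv
robust_se = np.sqrt(np.diag(cov_hc0))
```

This is the basic HC0 variant; common refinements rescale the squared residuals to reduce small-sample bias.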


CategoryRicottone

Statistics/OrdinaryLeastSquares (last edited 2025-01-10 14:33:38 by DominicRicottone)