Ordinary Least Squares

Ordinary Least Squares (OLS) is a linear regression method. It minimizes the sum of squared errors (residuals).


Univariate

The regression line passes through two points:

[ATTACH]

and

[ATTACH]

From these points and the generic equation for a line, it can be shown that the slope of the regression line is equal to:

[ATTACH]

The generic formula for the regression line is:

[ATTACH]
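Since the formula images are not reproduced here, the following sketch assumes the textbook expressions: the slope equals the covariance of x and y divided by the variance of x, and the line passes through the point of means.

```python
# Minimal sketch of univariate OLS, assuming slope = cov(x, y) / var(x)
# and that the regression line passes through (mean(x), mean(y)).
def univariate_ols(x, y):
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    slope = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
             / sum((xi - x_bar) ** 2 for xi in x))
    intercept = y_bar - slope * x_bar
    return intercept, slope

# Points lying exactly on y = 2x + 1 recover those coefficients.
b0, b1 = univariate_ols([0, 1, 2, 3], [1, 3, 5, 7])
```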


Linear Model

The linear model can be expressed as:

{{attachment:model1.svg}}

If these assumptions can be made:

  1. Linearity
  2. Exogeneity
  3. Random sampling
  4. No perfect multicollinearity
  5. Homoskedasticity

Then OLS is the best linear unbiased estimator (BLUE) for these coefficients.

Using the computation above, the coefficients are estimated as:

[ATTACH]

The variance of each coefficient is estimated as:

[ATTACH]

Note also that the standard deviation of the population parameter is unknown, so it is estimated as:

[ATTACH]
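As the formula images are missing here, this sketch assumes the standard matrix expressions: coefficient estimates (X'X)⁻¹X'y, and homoskedastic variance s²(X'X)⁻¹, where s² = e'e / (n − k) estimates the unknown error variance. The data are simulated for illustration only.

```python
import numpy as np

# Sketch of multivariate OLS point estimates and (homoskedastic)
# variance estimates, assuming beta_hat = (X'X)^-1 X'y and
# Var(beta_hat) = s^2 (X'X)^-1 with s^2 = e'e / (n - k).
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + regressor
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)      # true coefficients 1, 2

XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y                 # coefficient estimates
resid = y - X @ beta
s2 = resid @ resid / (n - X.shape[1])    # estimated error variance
var_beta = s2 * XtX_inv                  # variance-covariance matrix
se = np.sqrt(np.diag(var_beta))          # standard errors
```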

If the homoskedasticity assumption does not hold, then the variances of the coefficient estimators are actually:

{{attachment:hetero1.svg}}

And the variances for each coefficient are estimated as:

{{attachment:hetero2.svg}}
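A common estimator of this form is the HC0 "sandwich" estimator, (X'X)⁻¹ X' diag(eᵢ²) X (X'X)⁻¹; the sketch below assumes that is what the formula images show. Note that the coefficient point estimates themselves are unchanged; only the variance estimate differs. The heteroskedastic data are simulated for illustration.

```python
import numpy as np

# Sketch of a heteroskedasticity-robust (HC0 sandwich) variance estimate:
# Var(beta_hat) = (X'X)^-1 X' diag(e_i^2) X (X'X)^-1.
rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(size=n) * (1 + np.abs(x))  # noise grows with |x|

XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y                   # same point estimates as before
e = y - X @ beta
meat = X.T @ (X * (e ** 2)[:, None])       # X' diag(e_i^2) X
var_robust = XtX_inv @ meat @ XtX_inv      # sandwich estimator
se_robust = np.sqrt(np.diag(var_robust))   # robust standard errors
```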


CategoryRicottone

Statistics/OrdinaryLeastSquares (last edited 2025-01-10 14:33:38 by DominicRicottone)