
Ordinary Least Squares

Ordinary Least Squares (OLS) is a linear regression method. It estimates coefficients by minimizing the sum of squared residuals.


Univariate

Given one independent variable and one dependent (outcome) variable, the OLS model is specified as:

model.svg
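The attached image is not reproduced here; assuming it follows the usual textbook notation, the specification is:

```latex
y_i = \beta_0 + \beta_1 x_i + u_i
```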

It is estimated as:

estimate.svg
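Assuming the attachment follows the usual textbook notation, the fitted line and the closed-form estimates are:

```latex
\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i,
\qquad
\hat{\beta}_1 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2},
\qquad
\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}
```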

This model describes (1) the mean observation and (2) the marginal change in the outcome per unit change in the independent variable.
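As a minimal sketch of the closed-form univariate estimates (the data below are illustrative, not from this article):

```python
import numpy as np

# illustrative data: one independent variable, one outcome
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

# slope: covariance of x and y over variance of x
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
# intercept: the fitted line passes through the point of means
b0 = y.mean() - b1 * x.mean()
```

The second fact in the comment above is why the regression line always passes through (x̄, ȳ).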

The proof can be seen at Econometrics/OrdinaryLeastSquares/UnivariateProof.


Multivariate

Given k independent variables, the OLS model is specified as:

mmodel.svg
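Assuming the attachment follows the standard matrix notation, the specification with an n-vector of outcomes y, an n×(k+1) design matrix X (a constant column plus k regressors), and an n-vector of errors u is:

```latex
y = X\beta + u
```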

It is estimated as:

mestimate.svg
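A sketch of the matrix estimate, (X'X)⁻¹X'y, on simulated data; all names and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 2

# design matrix: constant column plus k regressors
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(scale=0.1, size=n)

# beta_hat = (X'X)^{-1} X'y, computed via a linear solve rather than an
# explicit inverse for numerical stability
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

Solving the normal equations directly is fine for illustration; in practice `np.linalg.lstsq` (a QR/SVD-based solver) is the numerically safer route to the same estimates.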


Estimated Coefficients

If these assumptions can be made:

  1. Linearity
  2. Exogeneity
  3. Random sampling
  4. No perfect multicollinearity
  5. Homoskedasticity

Then OLS is the best linear unbiased estimator (BLUE) for regression coefficients.

The variances for each coefficient are:

homo1.svg
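The attachment is not reproduced here; a standard textbook form of the homoskedastic variance, which it presumably shows, is:

```latex
\operatorname{Var}(\hat{\beta}_j) = \frac{\sigma^2}{\mathrm{SST}_j \, (1 - R_j^2)}
```

where SST_j is the total sum of squares of x_j and R_j² is the R² from regressing x_j on the other regressors.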

Note that the variance of the population error term is unknown, so it is estimated as:

homo2.svg
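A sketch of this estimation under homoskedasticity, using the usual degrees-of-freedom correction; the data and seed are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

# the error variance is unknown, so estimate it from the residuals,
# dividing by n minus the number of estimated parameters (k + 1)
sigma2_hat = resid @ resid / (n - k - 1)

# homoskedastic variance matrix: sigma2_hat * (X'X)^{-1};
# standard errors are the square roots of its diagonal
var_beta = sigma2_hat * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(var_beta))
```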

If the homoskedasticity assumption does not hold, then the variances for each coefficient are actually:

[ATTACH]

Wherein, for example, r1j is the residual from regressing x1 onto x2, ... xk.
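This residual-regression construction is the Frisch-Waugh-Lovell theorem: regressing the outcome on the residualized regressor alone recovers the multivariate coefficient exactly. A sketch on simulated data (all names and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# full multivariate OLS
X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# r1: residual from regressing x1 on the remaining regressors
# (here just the constant and x2)
Z = np.column_stack([np.ones(n), x2])
g = np.linalg.solve(Z.T @ Z, Z.T @ x1)
r1 = x1 - Z @ g

# the univariate slope of y on r1 equals the multivariate coefficient on x1
b1_fwl = (r1 @ y) / (r1 @ r1)
```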

The variances for each coefficient can be estimated with the Eicker-White formula:

[ATTACH]
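A sketch of the Eicker-White (HC0) sandwich estimator, (X'X)⁻¹ X' diag(û²) X (X'X)⁻¹, on simulated heteroskedastic data; all names and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 300, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta = np.array([1.0, 2.0, -0.5])

# heteroskedastic errors: the error spread grows with the first regressor
u = rng.normal(size=n) * (0.5 + np.abs(X[:, 1]))
y = X @ beta + u

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

# sandwich formula: bread * meat * bread
XtX_inv = np.linalg.inv(X.T @ X)                 # bread: (X'X)^{-1}
meat = (X * resid[:, None] ** 2).T @ X           # meat: sum of u_i^2 x_i x_i'
robust_var = XtX_inv @ meat @ XtX_inv
robust_se = np.sqrt(np.diag(robust_var))
```

Statistical packages typically apply small-sample corrections on top of this (HC1 through HC3); HC0 is the original uncorrected form.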

See Nicolai Kuminoff's video lectures for the derivation of the robust estimators.


CategoryRicottone

Statistics/OrdinaryLeastSquares (last edited 2025-09-03 02:08:40 by DominicRicottone)