Differences between revisions 17 and 23 (spanning 6 versions)
Revision 17 as of 2024-06-05 14:58:24
Size: 1809
Comment: Simplify language
Revision 23 as of 2025-01-10 14:33:11
Size: 2224
Comment: Killing Econometrics page
Deletions are marked with "-". Additions are marked with "+".
Line 1: Line 1:
+ ## page was renamed from Econometrics/OrdinaryLeastSquares
Line 13: Line 14:
- The regression line passes through two points:
+ Given one independent variable and one dependent (outcome) variable, the OLS model is specified as:
Line 15: Line 16:
- {{attachment:regression1.svg}}
+ {{attachment:model.svg}}
Line 17: Line 18:
- and
+ It is estimated as:
Line 19: Line 20:
- {{attachment:regression2.svg}}
+ {{attachment:estimate.svg}}
Line 21: Line 22:
- It can be [[Econometrics/OrdinaryLeastSquares/UnivariateProof|proven]] that the slope of the regression line is equal to:
+ This model describes (1) the mean observation and (2) the marginal changes to the outcome per unit changes in the independent variable.
Line 23: Line 24:
- {{attachment:b12.svg}}

- The generic formula for the regression line is:

- {{attachment:b13.svg}}
+ The derivation can be seen [[Econometrics/OrdinaryLeastSquares/Univariate|here]].
Line 35: Line 32:
+ Given ''k'' independent variables, the OLS model is specified as:

+ {{attachment:mmodel.svg}}

+ It is estimated as:

+ {{attachment:mestimate.svg}}

+ More conventionally, this is estimated with [[LinearAlgebra|linear algebra]] as:

+ {{attachment:matrix.svg}}

+ The derivation can be seen [[Econometrics/OrdinaryLeastSquares/Multivariate|here]].
Line 39: Line 50:
- == Linear Model ==
+ == Estimated Coefficients ==
Line 41: Line 52:
- The linear model can be expressed as:

- {{attachment:model1.svg}}

- If these assumptions can be made:
+ The '''Gauss-Markov theorem''' demonstrates that (with some assumptions) the OLS estimations are the '''best linear unbiased estimators''' ('''BLUE''') for the regression coefficients. The assumptions are:
Line 50: Line 57:
-  4. No perfect multicolinearity
+  4. No perfect [[LinearAlgebra/Basis|multicolinearity]]
Line 52: Line 59:

- Then OLS is the best linear unbiased estimator ('''BLUE''') for these coefficients.

- Using the computation above, the coefficients are estimated to produce:

- {{attachment:model2.svg}}

Ordinary Least Squares

Ordinary Least Squares (OLS) is a linear regression method. It estimates coefficients by minimizing the sum of squared residuals.


Univariate

Given one independent variable and one dependent (outcome) variable, the OLS model is specified as:

\[ y = \beta_0 + \beta_1 x + u \]

It is estimated as:

\[ \hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x \]

This model describes (1) the mean observation and (2) the marginal change in the outcome per unit change in the independent variable.

The derivation can be seen here.
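
For reference, a standard statement of the estimators that the linked derivation arrives at (the notation here is assumed, not the page's own):

\[ \hat{\beta}_1 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x} \]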


Multivariate

Given k independent variables, the OLS model is specified as:

\[ y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k + u \]

It is estimated as:

\[ \hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x_1 + \hat{\beta}_2 x_2 + \dots + \hat{\beta}_k x_k \]

More conventionally, this is estimated with linear algebra as:

\[ \hat{\beta} = (X^\top X)^{-1} X^\top y \]

The derivation can be seen here.
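
A minimal runnable sketch of this matrix computation, assuming NumPy (the data and variable names are illustrative, not from this page):

{{{#!python
import numpy as np

# Illustrative data: n observations, an intercept column, and 3 regressors.
rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
beta_true = np.array([1.0, 2.0, -0.5, 0.3])
y = X @ beta_true + rng.normal(size=n)

# Solve the normal equations: beta_hat = (X'X)^(-1) X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
residuals = y - X @ beta_hat
}}}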


Estimated Coefficients

The Gauss-Markov theorem demonstrates that (with some assumptions) the OLS estimations are the best linear unbiased estimators (BLUE) for the regression coefficients. The assumptions are:

  1. Linearity
  2. Exogeneity
  3. Random sampling
  4. No perfect multicollinearity
  5. Homoskedasticity
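
Stated formally (a sketch; the notation is assumed rather than the page's own), the exogeneity and homoskedasticity assumptions are:

\[ E[u \mid x_1, \dots, x_k] = 0, \qquad \operatorname{Var}(u \mid x_1, \dots, x_k) = \sigma^2 \]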

The variances for each coefficient are:

\[ \operatorname{Var}(\hat{\beta}_j) = \frac{\sigma^2}{SST_j \, (1 - R_j^2)} \]

where \( SST_j = \sum_i (x_{ij} - \bar{x}_j)^2 \) and \( R_j^2 \) is the R-squared from regressing \( x_j \) on all other independent variables.

Note that the variance of the population's error term is unknown, so it is estimated as:

\[ \hat{\sigma}^2 = \frac{1}{n - k - 1} \sum_i \hat{u}_i^2 \]

If the homoskedasticity assumption does not hold, then the estimator for each coefficient is instead analyzed through the decomposition:

\[ \hat{\beta}_1 = \beta_1 + \frac{\sum_j \hat{r}_{1j} u_j}{\sum_j \hat{r}_{1j}^2} \]

Wherein, for example, \( \hat{r}_{1j} \) is the residual for observation j from regressing \( x_1 \) onto \( x_2, \dots, x_k \).

The variances for each coefficient can be estimated with the Eicker-White formula:

\[ \widehat{\operatorname{Var}}(\hat{\beta}_1) = \frac{\sum_j \hat{r}_{1j}^2 \, \hat{u}_j^2}{\left( \sum_j \hat{r}_{1j}^2 \right)^2} \]

See Nicolai Kuminoff's video lectures for the derivation of the robust estimators.
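
A minimal runnable sketch of the robust calculation in its equivalent matrix ("sandwich") form, again assuming NumPy with illustrative data and names:

{{{#!python
import numpy as np

# Illustrative data with heteroskedastic errors (variance grows with |x1|).
rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([0.5, 1.0, -2.0]) + rng.normal(size=n) * (1.0 + np.abs(X[:, 1]))

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
u_hat = y - X @ beta_hat

# Eicker-White (HC0) sandwich covariance: (X'X)^-1 X' diag(u_hat^2) X (X'X)^-1
XtX_inv = np.linalg.inv(X.T @ X)
cov_hc0 = XtX_inv @ (X.T * u_hat**2) @ X @ XtX_inv
robust_se = np.sqrt(np.diag(cov_hc0))
}}}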


CategoryRicottone
