
Ordinary Least Squares

Ordinary Least Squares (OLS) is a linear regression method. It estimates coefficients by minimizing the sum of squared residuals.


Univariate

Given one independent variable and one dependent (outcome) variable, the OLS model is specified as:

y = \beta_0 + \beta_1 x + \varepsilon

It is estimated as:

\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x, \qquad \hat{\beta}_1 = \frac{\mathrm{Cov}(x, y)}{\mathrm{Var}(x)}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}

This model describes (1) the mean observation and (2) the marginal change in the outcome per unit change in the independent variable.

The derivation can be seen [[Statistics/OrdinaryLeastSquares/Univariate|here]].
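
A minimal sketch of these estimates in NumPy (the data and variable names here are illustrative, not from the original page):

{{{
import numpy as np

# Illustrative data: one independent variable x and one outcome y
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Slope: sample covariance of x and y over the sample variance of x
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# Intercept: the fitted line passes through the sample means
b0 = y.mean() - b1 * x.mean()

print(b0, b1)
}}}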


Multivariate

Given k independent variables, the OLS model is specified as:

y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k + \varepsilon

It is estimated as:

\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x_1 + \hat{\beta}_2 x_2 + \cdots + \hat{\beta}_k x_k

More conventionally, this is estimated with linear algebra as:

\hat{\beta} = (X^\top X)^{-1} X^\top y

The derivation can be seen [[Statistics/OrdinaryLeastSquares/Multivariate|here]].
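
A sketch of the linear algebra form in NumPy, assuming simulated data (the dimensions and values are illustrative); a least-squares solver is generally preferred over forming the inverse explicitly:

{{{
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: n observations, k independent variables, plus a constant column
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 0.5, -2.0, 0.25])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Closed form: beta_hat = (X'X)^-1 X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Equivalent but numerically safer: a least-squares solver
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta_hat)
}}}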


Estimated Coefficients

The Gauss-Markov theorem demonstrates that (given some assumptions) the OLS estimators are the best linear unbiased estimators (BLUE) for the regression coefficients. The assumptions are:

  1. Linearity
  2. Exogeneity, i.e. the independent variables are uncorrelated with the error term
  3. Random sampling
  4. No perfect multicollinearity
  5. Homoskedasticity, i.e. the variance of the error term is constant across observations

#5 mostly comes into play in the estimation of [[Statistics/StandardErrors|standard errors]], and there are alternative estimators that are robust to heteroskedasticity.
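
One such robust estimator is the Eicker-White (HC0) sandwich formula. A sketch in NumPy, assuming a design matrix X (with a constant column) and an outcome y as in the multivariate example above:

{{{
import numpy as np

def robust_se(X, y):
    """Heteroskedasticity-robust (Eicker-White, HC0) standard errors for OLS."""
    # OLS coefficients and residuals
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_hat

    # Sandwich: (X'X)^-1 [X' diag(e^2) X] (X'X)^-1
    bread = np.linalg.inv(X.T @ X)
    meat = X.T @ (X * (resid ** 2)[:, None])
    cov = bread @ meat @ bread

    return np.sqrt(np.diag(cov))
}}}

Libraries such as statsmodels expose the same family of estimators, e.g. OLS(y, X).fit(cov_type='HC0').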


CategoryRicottone
