Ordinary Least Squares
Ordinary Least Squares (OLS) is a linear regression method. It estimates the regression coefficients by minimizing the sum of squared residuals.
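To make the objective explicit, the univariate minimization problem can be written as follows (standard notation, not taken verbatim from this page):

    % OLS chooses the intercept and slope that minimize the sum of squared residuals
    \[
      (\hat{\beta}_0, \hat{\beta}_1)
        = \arg\min_{b_0,\, b_1} \sum_{i=1}^{n} \left( y_i - b_0 - b_1 x_i \right)^2
    \]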
Univariate
Given one independent variable and one dependent (outcome) variable, the OLS model is specified as:

    y_i = \beta_0 + \beta_1 x_i + \varepsilon_i

It is estimated as:

    \hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i,
    \qquad
    \hat{\beta}_1 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2},
    \qquad
    \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}

This model describes (1) the baseline level of the outcome (the intercept \beta_0) and (2) the marginal change in the outcome per unit change in the independent variable (the slope \beta_1).
The derivation can be seen here.
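A minimal numerical sketch of the univariate estimators above, using NumPy with made-up data:

    import numpy as np

    # Made-up data: outcome y generated from a single regressor x.
    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = 2.0 + 3.0 * x + rng.normal(size=100)  # true intercept 2, true slope 3

    # Closed-form univariate OLS estimates.
    x_bar, y_bar = x.mean(), y.mean()
    beta1_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    beta0_hat = y_bar - beta1_hat * x_bar

    print(beta0_hat, beta1_hat)  # should be close to 2 and 3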
Multivariate
Given k independent variables, the OLS model is specified as:

    y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{2i} + \dots + \beta_k x_{ki} + \varepsilon_i

It is estimated as:

    \hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_{1i} + \hat{\beta}_2 x_{2i} + \dots + \hat{\beta}_k x_{ki}

More conventionally, this is estimated with linear algebra as:

    \hat{\beta} = (X'X)^{-1} X'y

where X is the n-by-(k+1) matrix of regressors (including a column of ones for the intercept) and y is the vector of outcomes.
The derivation can be seen here.
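A minimal sketch of the matrix formula, again with made-up data; np.linalg.solve is used rather than explicitly inverting X'X:

    import numpy as np

    # Made-up data: n observations, an intercept column, and k = 2 regressors.
    rng = np.random.default_rng(1)
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    beta_true = np.array([1.0, -2.0, 0.5])
    y = X @ beta_true + rng.normal(size=n)

    # OLS via the normal equations: (X'X) beta_hat = X'y.
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
    print(beta_hat)  # should be close to [1.0, -2.0, 0.5]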
Estimated Coefficients
The Gauss-Markov theorem demonstrates that (with some assumptions) the OLS estimators are the best linear unbiased estimators (BLUE) for the regression coefficients. The assumptions are:
- Linearity
- Exogeneity
- Random sampling
- No perfect multicollinearity (see the sketch after this list)
- Homoskedasticity
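A small sketch of why perfect multicollinearity is ruled out: if one regressor is an exact linear combination of the others (made-up data below), X'X is singular and the normal equations do not pin down a unique coefficient vector.

    import numpy as np

    # Made-up design matrix: intercept, one regressor, and a third column that is
    # an exact linear combination of the first two (2*x1 - 3*constant).
    rng = np.random.default_rng(2)
    n = 50
    x1 = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, 2.0 * x1 - 3.0])  # perfectly collinear

    # Only 2 of the 3 columns are linearly independent, so X'X is rank deficient
    # (singular) and no unique OLS solution exists.
    print(np.linalg.matrix_rank(X))        # 2, not 3
    print(np.linalg.matrix_rank(X.T @ X))  # also 2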
Under these assumptions, the variances for each coefficient (shown here for \hat{\beta}_1) are:

    Var(\hat{\beta}_1) = \frac{\sigma^2}{SST_1 \, (1 - R_1^2)}

where SST_1 = \sum_j (x_{1j} - \bar{x}_1)^2 is the total variation in x_1 and R_1^2 is the R^2 from regressing x_1 on the other independent variables. Note that \sigma, the standard deviation of the population error term, is unknown, so it is estimated from the OLS residuals \hat{u}_j as:

    \hat{\sigma}^2 = \frac{1}{n - k - 1} \sum_j \hat{u}_j^2
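A sketch of the conventional (homoskedasticity-based) standard errors, reusing the made-up multivariate data from above; the diagonal of \hat{\sigma}^2 (X'X)^{-1} matches the per-coefficient formula:

    import numpy as np

    # Same made-up data as the multivariate sketch above.
    rng = np.random.default_rng(1)
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

    k = X.shape[1] - 1                              # regressors, excluding the intercept
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ beta_hat

    sigma2_hat = resid @ resid / (n - k - 1)        # estimate of sigma^2
    var_beta = sigma2_hat * np.linalg.inv(X.T @ X)  # classical covariance matrix
    print(np.sqrt(np.diag(var_beta)))               # conventional standard errors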
If the homoskedasticity assumption does not hold, then the variances for each coefficient are actually:

    Var(\hat{\beta}_1) = \frac{\sum_j r_{1j}^2 \, \sigma_j^2}{\left( \sum_j r_{1j}^2 \right)^2}

wherein, for example, r_{1j} is the j-th residual from regressing x_1 onto x_2, ..., x_k, and \sigma_j^2 is the error variance for observation j. The variances for each coefficient can be estimated with the Eicker-White formula:

    \widehat{Var}(\hat{\beta}_1) = \frac{\sum_j r_{1j}^2 \, \hat{u}_j^2}{\left( \sum_j r_{1j}^2 \right)^2}

where \hat{u}_j are the OLS residuals.
See Nicolai Kuminoff's video lectures for the derivation of the robust estimators.
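A minimal sketch of the Eicker-White robust standard errors in matrix (sandwich) form, again with the made-up data from above; the diagonal of (X'X)^{-1} X' diag(\hat{u}_j^2) X (X'X)^{-1} matches the per-coefficient formula:

    import numpy as np

    # Same made-up data as the earlier multivariate sketches.
    rng = np.random.default_rng(1)
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ beta_hat

    XtX_inv = np.linalg.inv(X.T @ X)
    meat = X.T @ (X * resid[:, None] ** 2)  # X' diag(u_hat^2) X
    var_robust = XtX_inv @ meat @ XtX_inv   # Eicker-White (HC0) covariance matrix
    print(np.sqrt(np.diag(var_robust)))     # robust standard errors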