Ordinary Least Squares
Ordinary Least Squares (OLS) is a linear regression method. It estimates the coefficients of a linear model by minimizing the sum of squared residuals.
Univariate
Given one independent variable and one dependent (outcome) variable, the OLS model is specified as:
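In conventional notation, with y as the outcome, x as the independent variable, and u as the error term (symbols assumed here, not taken from the original page), this specification is usually written as:

y_i = \beta_0 + \beta_1 x_i + u_i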
It is estimated as:
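Its fitted counterpart, with hats denoting the OLS estimates, is typically written as:

\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i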
This model describes (1) the mean observation (the fitted line passes through the point of sample means) and (2) the marginal change in the outcome per unit change in the independent variable.
The derivation can be seen here.
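That derivation leads to the familiar closed-form estimates, which in the notation above are:

\hat{\beta}_1 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}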
Multivariate
Given k independent variables, the OLS model is specified as:
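With x_1 through x_k denoting the regressors and u the error term, this is conventionally written as:

y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_k x_{ik} + u_i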
It is estimated as:
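Its fitted counterpart is typically:

\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_{i1} + \cdots + \hat{\beta}_k x_{ik}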
More conventionally, this is estimated with linear algebra as:
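Stacking the outcomes into the vector y and the regressors (with a leading column of ones) into the matrix X, the standard matrix form of the estimator is:

\hat{\beta} = (X'X)^{-1} X'y

A minimal numerical sketch of this matrix computation, assuming NumPy and purely illustrative data (none of these names come from the original page):

import numpy as np

# Illustrative data: n observations, k regressors, plus an intercept column.
rng = np.random.default_rng(0)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 0.5, -2.0, 0.25])
y = X @ beta_true + rng.normal(size=n)

# Solve the normal equations X'X b = X'y instead of forming the inverse explicitly,
# which evaluates (X'X)^{-1} X'y in a more numerically stable way.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)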
The derivation can be seen here.
Estimated Coefficients
The Gauss-Markov theorem demonstrates that, under some assumptions, the OLS estimators are the best linear unbiased estimators (BLUE) of the regression coefficients. The assumptions are:
- Linearity in parameters
- Random sampling
- No perfect multicollinearity
- Zero conditional mean of the errors
- Homoskedasticity
Under homoskedasticity, the variances for each coefficient are:
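A standard way to write this variance, with \sigma^2 the error variance, SST_j the total sample variation in x_j, and R_j^2 the R-squared from regressing x_j on the remaining regressors, is:

Var(\hat{\beta}_j) = \frac{\sigma^2}{SST_j (1 - R_j^2)}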
Note that the standard deviation of the population error term is unknown, so it is estimated from the residuals:
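The usual unbiased estimator, with \hat{u}_i the OLS residuals, n observations, and k regressors, is the following, whose square root is used in practice:

\hat{\sigma}^2 = \frac{\sum_i \hat{u}_i^2}{n - k - 1}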
If the homoskedasticity assumption does not hold, then the variances for each coefficient are actually:
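With \sigma_i^2 denoting the error variance of observation i, this is commonly written as:

Var(\hat{\beta}_j) = \frac{\sum_i \hat{r}_{ij}^2 \sigma_i^2}{\left( \sum_i \hat{r}_{ij}^2 \right)^2}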
Wherein, for example, \hat{r}_{i1} is the residual for observation i from regressing x_1 onto x_2, ..., x_k.
The variances for each coefficient can be estimated with the Eicker-White formula:
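This estimator replaces the unknown \sigma_i^2 with the squared OLS residuals \hat{u}_i^2; in the notation above it is commonly written as:

\widehat{Var}(\hat{\beta}_j) = \frac{\sum_i \hat{r}_{ij}^2 \hat{u}_i^2}{\left( \sum_i \hat{r}_{ij}^2 \right)^2}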
See Nicolai Kuminoff's video lectures for the derivation of the robust estimators.
