Ordinary Least Squares
Ordinary Least Squares (OLS) is a linear regression method. It estimates coefficients by minimizing the sum of squared residuals.
Univariate
Given one independent variable and one dependent (outcome) variable, the OLS model is specified as:
y = β₀ + β₁x + u
It is estimated as:
β̂₁ = Σᵢ(xᵢ − x̄)(yᵢ − ȳ) / Σᵢ(xᵢ − x̄)², β̂₀ = ȳ − β̂₁x̄
This model describes (1) the mean of the outcome when the independent variable is zero (the intercept) and (2) the marginal change in the outcome per unit change in the independent variable (the slope).
The proof can be seen here.
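As a quick illustration (not from the page itself; the function name is ours), the univariate estimates can be computed directly from sample moments:

```python
# Minimal sketch of univariate OLS: slope = cov(x, y) / var(x),
# intercept = mean(y) - slope * mean(x). Function name is illustrative.
def ols_univariate(x, y):
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # The common 1/n (or 1/(n-1)) divisor cancels in the ratio.
    s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    s_xx = sum((xi - x_bar) ** 2 for xi in x)
    b1 = s_xy / s_xx
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Data generated exactly as y = 2 + 3x, so the fit recovers (2, 3).
b0, b1 = ols_univariate([1, 2, 3, 4], [5, 8, 11, 14])
```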
Multivariate
Given k independent variables, the OLS model is specified as:
y = β₀ + β₁x₁ + β₂x₂ + … + βₖxₖ + u
It is estimated, coefficient by coefficient, as:
β̂ⱼ = Σᵢ r̂ᵢⱼyᵢ / Σᵢ r̂ᵢⱼ², where r̂ᵢⱼ is the residual from regressing xⱼ onto the other independent variables.
More conventionally, this is estimated with linear algebra as:
β̂ = (XᵀX)⁻¹Xᵀy
The proof can be seen here.
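A sketch of the matrix form in Python (illustrative only; the function name and data are made up here):

```python
import numpy as np

# Sketch of the matrix form: beta_hat solves (X'X) beta = X'y, i.e.
# beta_hat = (X'X)^{-1} X'y. A constant column is prepended for the
# intercept.
def ols_matrix(X, y):
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    return np.linalg.solve(X.T @ X, X.T @ np.asarray(y, dtype=float))

# y = 1 + 2*x1 - x2 exactly, so the estimate recovers [1, 2, -1].
beta = ols_matrix([[0, 0], [1, 0], [0, 1], [1, 1], [2, 3]], [1, 3, 0, 2, 2])
```

Solving the normal equations with `np.linalg.solve` is numerically preferable to forming the inverse explicitly.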
Estimated Coefficients
If these assumptions can be made:
- Linearity
- Random sampling
- No perfect multicollinearity
- Zero conditional mean of the error
- Homoskedasticity
Then OLS is the best linear unbiased estimator (BLUE) for regression coefficients.
The variances for each coefficient are:
Var(β̂ⱼ) = σ² / (SSTⱼ(1 − R²ⱼ)), where SSTⱼ is the total variation in xⱼ and R²ⱼ is the R² from regressing xⱼ onto the other independent variables.
Note that the variance of the population error, σ², is unknown, so it's estimated like:
σ̂² = SSR / (n − k − 1), where SSR is the sum of squared residuals.
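As a sketch (our own names; assuming the standard homoskedastic estimator, whose per-coefficient values are the diagonal of σ̂²(XᵀX)⁻¹):

```python
import numpy as np

# Sketch of the homoskedastic variance estimates: the diagonal of
# sigma2_hat * (X'X)^{-1}, with sigma2_hat = SSR / (n - k - 1).
def ols_variances(X, y):
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    y = np.asarray(y, dtype=float)
    n, p = X.shape  # p = k + 1 (intercept plus k regressors)
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ beta
    sigma2_hat = resid @ resid / (n - p)  # SSR / (n - k - 1)
    return sigma2_hat * np.diag(np.linalg.inv(X.T @ X))

X = [[0, 0], [1, 0], [0, 1], [1, 1], [2, 3]]
var_exact = ols_variances(X, [1, 3, 0, 2, 2])        # perfect fit: SSR = 0
var_noisy = ols_variances(X, [1.1, 2.9, 0.2, 2.0, 2.1])
```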
If the homoskedasticity assumption does not hold, then the variances for each coefficient are actually:
Var(β̂₁) = Σᵢ r̂ᵢ₁²σᵢ² / (Σᵢ r̂ᵢ₁²)²
Where, for example, r̂ᵢ₁ is the residual from regressing x₁ onto x₂, …, xₖ.
The variances for each coefficient can be estimated with the Eicker-White formula, which replaces each observation's unknown error variance with its squared OLS residual ûᵢ².
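The robust variances can be sketched as the diagonal of the sandwich (XᵀX)⁻¹ Xᵀdiag(û²) X (XᵀX)⁻¹ (function name and data are ours):

```python
import numpy as np

# Sketch of the Eicker-White sandwich estimator: the diagonal of
# (X'X)^{-1} X' diag(u_hat^2) X (X'X)^{-1}, where u_hat are the
# OLS residuals.
def white_variances(X, y):
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    y = np.asarray(y, dtype=float)
    bread = np.linalg.inv(X.T @ X)
    u = y - X @ (bread @ X.T @ y)        # OLS residuals
    meat = X.T @ (u[:, None] ** 2 * X)   # X' diag(u^2) X without building diag
    return np.diag(bread @ meat @ bread)

X = [[0, 0], [1, 0], [0, 1], [1, 1], [2, 3]]
rv = white_variances(X, [1.1, 2.9, 0.2, 2.0, 2.1])
```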
See Nicolai Kuminoff's video lectures for the derivation of the robust estimators.