Ordinary Least Squares
Ordinary Least Squares (OLS) is a linear regression method. It fits the model by minimizing the sum of squared errors (residuals).
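As a sketch in generic notation (the symbols $y_i$, $x_i$, and $\beta$ are assumed here, not taken from the page), OLS chooses the coefficients that solve:

\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n} \left( y_i - x_i^\top \beta \right)^2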
Univariate
The regression line passes through two particular points. Substituting these points into the generic equation for a line gives the slope of the regression line, and from the slope follows the generic formula for the regression line; both are sketched below.
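A minimal sketch in assumed notation, where $\bar{x}$ and $\bar{y}$ are sample means and $\overline{xy}$, $\overline{x^2}$ the corresponding sample averages. One common reading of the two points, obtained from the normal equations (for $\bar{x} \neq 0$), is:

(\bar{x},\ \bar{y}) \qquad \text{and} \qquad \left( \frac{\overline{x^2}}{\bar{x}},\ \frac{\overline{xy}}{\bar{x}} \right)

Substituting both into $y = \hat{\beta}_0 + \hat{\beta}_1 x$ gives the slope

\hat{\beta}_1 = \frac{\overline{xy} - \bar{x}\,\bar{y}}{\overline{x^2} - \bar{x}^2} = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}

and the regression line

\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}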
Linear Model
The linear model can be expressed as:
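One standard way to write it, in assumed matrix notation with $n$ observations and $k$ regressors plus a constant:

y = X\beta + u

where $y$ is the $n \times 1$ response vector, $X$ the $n \times (k+1)$ regressor matrix, $\beta$ the coefficient vector, and $u$ the unobserved error term.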
If these assumptions hold:
- Linearity
- Random sampling
- No perfect multicollinearity
- Zero conditional mean of the errors
- Homoskedasticity
Then OLS is the best linear unbiased estimator (BLUE) for the coefficients.
Using the computation above, the coefficients are estimated as:
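Keeping the assumed matrix notation, the familiar closed form is:

\hat{\beta} = (X^\top X)^{-1} X^\top y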
The variances of the coefficients are:
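Under homoskedasticity, with $\sigma^2$ (an assumed symbol) denoting the common error variance, the standard result is:

\mathrm{Var}(\hat{\beta} \mid X) = \sigma^2 (X^\top X)^{-1}

with each coefficient's variance on the diagonal.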
Note that the standard deviation of the population error term is unknown, so it is estimated as:
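A common unbiased estimator, with $\hat{u} = y - X\hat{\beta}$ the residual vector (an assumed name):

\hat{\sigma}^2 = \frac{\hat{u}^\top \hat{u}}{n - k - 1}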
If the homoskedasticity assumption does not hold, each estimator can still be written in terms of the errors as:
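In the same assumed notation, substituting $y = X\beta + u$ into the closed form gives:

\hat{\beta} = \beta + (X^\top X)^{-1} X^\top u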
It follows that the variances of the coefficients are:
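With independent errors and $\Omega = \mathrm{diag}(\sigma_1^2, \dots, \sigma_n^2)$ (an assumed symbol for the error covariance matrix), the sandwich form is:

\mathrm{Var}(\hat{\beta} \mid X) = (X^\top X)^{-1} X^\top \Omega X\, (X^\top X)^{-1}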
These variances can be estimated with the Eicker-White formula:
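A sketch of that estimator replaces each unknown $\sigma_i^2$ with the squared residual $\hat{u}_i^2$:

\widehat{\mathrm{Var}}(\hat{\beta}) = (X^\top X)^{-1} X^\top \mathrm{diag}(\hat{u}_1^2, \dots, \hat{u}_n^2)\, X\, (X^\top X)^{-1}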