Ordinary Least Squares
Ordinary Least Squares (OLS) is a linear regression method. It estimates the coefficients of a linear model by minimizing the sum of squared residuals.
Univariate
The regression line passes through two points, one from each of the two normal equations: the point of sample means

$(\bar{x},\ \bar{y})$

and the point of $x$-weighted means

$\left(\dfrac{\overline{x^2}}{\bar{x}},\ \dfrac{\overline{xy}}{\bar{x}}\right)$

where an overbar denotes a sample mean, e.g. $\overline{xy} = \frac{1}{n}\sum_{i=1}^{n} x_i y_i$.
These two points, together with the generic equation for a line, prove that the slope of the regression line is equal to:

$\hat{\beta}_1 = \frac{\overline{xy} - \bar{x}\,\bar{y}}{\overline{x^2} - \bar{x}^2} = \frac{\mathrm{Cov}(x, y)}{\mathrm{Var}(x)}$
The generic formula for the regression line is:

$\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$
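As a sketch, the slope and intercept formulas above can be verified numerically. This assumes NumPy and uses synthetic data (the true line $y = 2 + 3x$ plus noise is an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 3.0 * x + rng.normal(size=100)

x_bar, y_bar = x.mean(), y.mean()

# Slope from the covariance/variance ratio derived above.
beta1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
# Intercept follows from the line passing through (x_bar, y_bar).
beta0 = y_bar - beta1 * x_bar

# Cross-check against NumPy's own degree-1 least-squares fit.
check1, check0 = np.polyfit(x, y, deg=1)
assert np.isclose(beta1, check1) and np.isclose(beta0, check0)
```

The estimates agree with `np.polyfit` to floating-point precision, which is expected since both minimize the same sum of squared residuals.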
Linear Model
The linear model can be expressed as:

$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k + u$

or, in matrix form, $y = X\beta + u$.
If these assumptions (the Gauss–Markov assumptions) can be made:
- Linearity in parameters
- Random sampling
- No perfect multicollinearity
- Zero conditional mean of the error: $E[u \mid x_1, \ldots, x_k] = 0$
- Homoskedasticity: $\mathrm{Var}(u \mid x_1, \ldots, x_k) = \sigma^2$
Then OLS is the best linear unbiased estimator (BLUE) for these coefficients.
Using the computation above, minimizing the sum of squared residuals produces the coefficient estimates:

$\hat{\beta} = (X^{\top}X)^{-1}X^{\top}y$
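A minimal sketch of the normal-equations estimate, assuming NumPy and synthetic data with an arbitrary true coefficient vector:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # intercept column
beta_true = np.array([1.0, 2.0, -0.5, 0.3])
y = X @ beta_true + rng.normal(size=n)

# Normal-equations estimate: beta_hat = (X'X)^{-1} X'y,
# computed via a linear solve rather than an explicit inverse.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against NumPy's least-squares solver.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(beta_hat, beta_lstsq)
```

Solving the system directly is numerically preferable to forming `inv(X.T @ X)`, though both implement the same formula.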
The variance of each coefficient is estimated as:

$\widehat{\mathrm{Var}}(\hat{\beta}_j) = \frac{\hat{\sigma}^2}{\mathrm{SST}_j\,(1 - R_j^2)}$

where $\mathrm{SST}_j = \sum_{i=1}^{n}(x_{ij} - \bar{x}_j)^2$ and $R_j^2$ is the $R^2$ from regressing $x_j$ on all of the other regressors.
Note also that the standard deviation of the population's error term is unknown, so it is estimated from the residuals:

$\hat{\sigma}^2 = \frac{1}{n - k - 1}\sum_{i=1}^{n}\hat{u}_i^2$
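The per-coefficient variance formula and the matrix form $\hat{\sigma}^2 (X^{\top}X)^{-1}$ are algebraically equivalent, which can be checked numerically. A sketch assuming NumPy, with synthetic data and an arbitrarily chosen coefficient index `j`:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 500, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

# Error variance estimate with n - k - 1 degrees of freedom
# (k regressors plus an intercept).
sigma2_hat = resid @ resid / (n - k - 1)

# Coefficient variances: diagonal of sigma2_hat * (X'X)^{-1}.
var_beta = sigma2_hat * np.diag(np.linalg.inv(X.T @ X))

# Equivalent per-coefficient form: sigma2_hat / (SST_j * (1 - R_j^2)),
# where R_j^2 comes from regressing x_j on the other regressors.
j = 1
others = np.delete(X, j, axis=1)
gamma, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
r = X[:, j] - others @ gamma            # residual part of x_j
sst_j = np.sum((X[:, j] - X[:, j].mean()) ** 2)
r2_j = 1 - (r @ r) / sst_j
assert np.isclose(var_beta[j], sigma2_hat / (sst_j * (1 - r2_j)))
```

The equivalence follows from the Frisch–Waugh–Lovell theorem: the $j$-th diagonal entry of $(X^{\top}X)^{-1}$ equals $1/\sum_i \hat{r}_{ij}^2 = 1/(\mathrm{SST}_j(1 - R_j^2))$.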
If the homoskedasticity assumption does not hold, the coefficient estimates themselves are unchanged, but the variance of the estimators is actually the sandwich form:

$\mathrm{Var}(\hat{\beta}) = (X^{\top}X)^{-1} X^{\top}\Omega\,X\,(X^{\top}X)^{-1}, \qquad \Omega = \mathrm{diag}(\sigma_1^2, \ldots, \sigma_n^2)$

And the variances for each coefficient are estimated with the heteroskedasticity-robust (White) estimator, which replaces each unknown $\sigma_i^2$ with the squared residual $\hat{u}_i^2$:

$\widehat{\mathrm{Var}}(\hat{\beta}) = (X^{\top}X)^{-1} X^{\top}\mathrm{diag}(\hat{u}_1^2, \ldots, \hat{u}_n^2)\,X\,(X^{\top}X)^{-1}$
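A sketch of the robust sandwich estimate alongside the usual one, assuming NumPy. The data is synthetic with error variance deliberately made to grow with $x$ so the two estimates differ:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
x = rng.uniform(0.1, 2.0, size=n)
X = np.column_stack([np.ones(n), x])
# Heteroskedastic errors: standard deviation grows with x.
y = 1.0 + 2.0 * x + rng.normal(scale=x, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
u = y - X @ beta_hat

# Usual (homoskedasticity-assuming) variance estimate.
sigma2_hat = u @ u / (n - 2)
var_usual = sigma2_hat * np.diag(XtX_inv)

# White's robust sandwich estimate:
# (X'X)^{-1} X' diag(u_i^2) X (X'X)^{-1}.
meat = X.T @ (X * (u ** 2)[:, None])
var_robust = np.diag(XtX_inv @ meat @ XtX_inv)

print("usual SEs: ", np.sqrt(var_usual))
print("robust SEs:", np.sqrt(var_robust))
```

With heteroskedastic errors the robust standard errors are the ones with valid coverage; under homoskedasticity the two estimates converge.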