
= Ordinary Least Squares =

Ordinary Least Squares (OLS) is a linear regression method. It estimates coefficients by minimizing the sum of squared residuals (equivalently, the mean squared error).


== Univariate ==

The regression line passes through two points:

(\bar{x}, \bar{y})

and

(\bar{x} + s_x, \bar{y} + r s_y)

These points, with the generic equation for a line, can prove that the slope of the regression line is equal to:

m = r \frac{s_y}{s_x}

The generic formula for the regression line is:

\hat{y} = \bar{y} + r \frac{s_y}{s_x} (x - \bar{x})
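The univariate slope and intercept can be checked numerically. A minimal sketch with NumPy, using invented sample data; the slope from the correlation and standard deviations should agree with a direct least-squares fit:

```python
import numpy as np

# Invented sample data, for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Slope from the correlation and standard deviations: m = r * s_y / s_x.
r = np.corrcoef(x, y)[0, 1]
m = r * y.std(ddof=1) / x.std(ddof=1)

# Intercept follows because the line passes through (mean(x), mean(y)).
b = y.mean() - m * x.mean()

# Cross-check against NumPy's own least-squares fit.
m_check, b_check = np.polyfit(x, y, 1)
print(m, b)
```

Any consistent `ddof` works here: the correlation is scale-free, and the same degrees-of-freedom correction cancels in the ratio of standard deviations.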


== Multivariate ==


=== Linear Model ===

The linear model can be expressed as:

y = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k + u

If these assumptions can be made:

  1. Linearity
  2. Exogeneity
  3. Random sampling
  4. No perfect multicollinearity
  5. Homoskedasticity

Then OLS is the best linear unbiased estimator (BLUE) for these coefficients; this is the Gauss-Markov theorem.

Using the computation above, the coefficients are estimated to produce:

\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x_1 + \cdots + \hat{\beta}_k x_k
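As an illustration, the coefficient estimates can be computed with the closed-form normal-equations solution \hat{\beta} = (X'X)^{-1} X'y. A sketch with simulated data; the regressors, noise, and true coefficients here are invented for the example:

```python
import numpy as np

# Simulated data; the true coefficients (1.0, 2.0, -0.5) are invented.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
u = rng.normal(size=n)                 # error term
y = 1.0 + 2.0 * x1 - 0.5 * x2 + u

# Design matrix with an intercept column.
X = np.column_stack([np.ones(n), x1, x2])

# Normal equations: solve (X'X) beta_hat = X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)   # close to [1.0, 2.0, -0.5]
```

Solving the linear system directly is preferred over explicitly inverting X'X; for ill-conditioned designs, `np.linalg.lstsq` is the more robust choice.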

The variances for each coefficient are:

Var(\hat{\beta}_i) = \frac{\sigma^2}{SST_i (1 - R_i^2)}

where SST_i is the total sum of squares of x_i, and R_i^2 is from regressing x_i on the other regressors.

Note that the variance of the population error term is unknown, so it is estimated as:

\hat{\sigma}^2 = \frac{1}{n - k - 1} \sum_{j=1}^{n} \hat{u}_j^2
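These classical (homoskedastic) variances can be sketched in NumPy. This uses the equivalent matrix form \hat{\sigma}^2 (X'X)^{-1}, whose diagonal holds the per-coefficient variances; the data and coefficients are simulated for the example:

```python
import numpy as np

# Simulated data with homoskedastic errors; coefficients are invented.
rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
u_hat = y - X @ beta_hat                    # residuals
k = X.shape[1] - 1                          # regressors, excluding the intercept
sigma2_hat = (u_hat @ u_hat) / (n - k - 1)  # estimated error variance

# Classical variance matrix: sigma_hat^2 * (X'X)^{-1}.
var_beta = sigma2_hat * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(var_beta))             # standard errors
print(se)
```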

If the homoskedasticity assumption does not hold, the estimators for each coefficient can instead be written as:

\hat{\beta}_i = \beta_i + \frac{\sum_{j=1}^{n} \hat{r}_{ij} u_j}{\sum_{j=1}^{n} \hat{r}_{ij}^2}

where, for example, \hat{r}_{1j} is the residual for observation j from regressing x_1 on x_2, ..., x_k.

The variances for each coefficient can be estimated with the Eicker-White formula:

\widehat{Var}(\hat{\beta}_i) = \frac{\sum_{j=1}^{n} \hat{r}_{ij}^2 \hat{u}_j^2}{\left( \sum_{j=1}^{n} \hat{r}_{ij}^2 \right)^2}
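A sketch of the Eicker-White estimate for one coefficient, again on simulated (invented) data with deliberately heteroskedastic errors. The per-coefficient residual form is compared against the equivalent sandwich matrix (X'X)^{-1} X' diag(\hat{u}^2) X (X'X)^{-1}, often called HC0:

```python
import numpy as np

# Simulated data; error variance deliberately grows with |x1|.
rng = np.random.default_rng(2)
n = 1000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
u = rng.normal(size=n) * (0.5 + np.abs(x1))
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 - 0.5 * x2 + u

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
u_hat = y - X @ beta_hat

# Residual from regressing x1 on the other regressors (constant and x2).
Z = np.column_stack([np.ones(n), x2])
gamma = np.linalg.solve(Z.T @ Z, Z.T @ x1)
r1 = x1 - Z @ gamma

# Eicker-White variance for beta_1: sum(r^2 u^2) / (sum r^2)^2.
var_b1_robust = np.sum(r1**2 * u_hat**2) / np.sum(r1**2) ** 2

# Equivalent sandwich (HC0) form, for comparison.
XtX_inv = np.linalg.inv(X.T @ X)
var_robust = XtX_inv @ (X.T * u_hat**2) @ X @ XtX_inv
print(var_b1_robust, var_robust[1, 1])
```

The two values agree: by the Frisch-Waugh-Lovell theorem, the per-coefficient residual formula is exactly the corresponding diagonal element of the sandwich matrix.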


CategoryRicottone

Statistics/OrdinaryLeastSquares (last edited 2025-01-10 14:33:38 by DominicRicottone)