= Ordinary Least Squares =

'''Ordinary Least Squares''' ('''OLS''') is a linear regression method, and is effectively synonymous with the '''linear regression model'''.

<<TableOfContents>>

----

== Description ==

A linear model is expressed as either {{attachment:model.svg}} (univariate) or {{attachment:mmodel.svg}} (multivariate with ''k'' terms).

Either way, a crucial assumption is that the expected value of the error term is 0, such that the [[Statistics/Moments|first moment]] is ''E[y,,i,,|x,,i,,] = α + βx,,i,,''.

=== Single Regression ===

In the case of a single predictor, the OLS regression is:

{{attachment:estimate.svg}}

This formulation leaves the components explicit: the y-intercept term is the mean outcome at ''x=0'', and the slope term is the marginal change in the outcome per unit change in ''x''.

The derivation can be seen [[Statistics/OrdinaryLeastSquares/Single|here]], and a numerical sketch is given in the examples below.

=== Multiple Regression ===

In the case of multiple predictors, the regression is fit as:

{{attachment:mestimate.svg}}

Conventionally, though, this OLS system is solved using [[LinearAlgebra|linear algebra]] as:

{{attachment:matrix.svg}}

Note that the use of ''b'' here is [[Statistics/EconometricsNotation#Models|intentional]]. The derivation can be seen [[Statistics/OrdinaryLeastSquares/Multiple|here]], and a numerical sketch is given in the examples below.

----

== Estimated Coefficients ==

The '''Gauss-Markov theorem''' demonstrates that, under the following assumptions, the OLS estimates are the '''best linear unbiased estimators''' ('''BLUE''') of the regression coefficients:

 1. Linearity
 2. Exogeneity, i.e. the predictors are independent of the error term
 3. Random sampling
 4. No perfect [[LinearAlgebra/Basis|multicollinearity]]
 5. Homoskedasticity, i.e. the variance of the error term is constant across observations

#5 mostly comes into play in the estimation of [[Statistics/StandardErrors|standard errors]], and there are alternative estimators that are robust to heteroskedasticity; one is sketched in the examples below.

----
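== Examples ==

These are minimal sketches rather than authoritative implementations; they assume `numpy` and use made-up toy data, not anything from the derivations linked above.

For a single predictor, the slope is the sample covariance of ''x'' and ''y'' over the sample variance of ''x'', and the intercept follows from the sample means:

{{{
import numpy as np

# Made-up toy data: outcome y and a single predictor x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Slope: sample covariance of (x, y) over sample variance of x
# (ddof must match between the two so the scaling cancels)
beta = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# Intercept: mean outcome net of the slope effect at the mean of x
alpha = y.mean() - beta * x.mean()

print(alpha, beta)  # approximately 0.14 and 1.96 for this toy data
}}}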
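For multiple predictors, the normal equations can be solved directly. A sketch, assuming a randomly generated design matrix with a constant and two predictors:

{{{
import numpy as np

rng = np.random.default_rng(0)

# Made-up design: n observations, a constant plus k = 2 predictors
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.1, size=n)

# Normal equations: b = (X'X)^-1 X'y, solved as a linear system
# rather than by explicitly inverting X'X
b = np.linalg.solve(X.T @ X, X.T @ y)

print(b)  # close to [1.0, 2.0, -0.5]
}}}

Solving the system (or using `numpy.linalg.lstsq`, which works from a factorization of ''X'' itself) is numerically preferable to forming the inverse of ''X'X'' explicitly.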
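Finally, under heteroskedasticity the coefficient estimates remain unbiased but the classical standard errors do not hold. A sketch of White's heteroskedasticity-robust ("sandwich") estimator alongside the classical one; the function name `ols_standard_errors` is made up for illustration:

{{{
import numpy as np

def ols_standard_errors(X, y, robust=False):
    """Classical or White (HC0) standard errors for OLS coefficients."""
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y
    e = y - X @ b                              # residuals
    if robust:
        # Sandwich estimator: (X'X)^-1 X' diag(e^2) X (X'X)^-1
        meat = X.T @ (X * e[:, None] ** 2)
        cov = XtX_inv @ meat @ XtX_inv
    else:
        # Classical estimator: s^2 (X'X)^-1, valid under homoskedasticity
        s2 = e @ e / (n - k)
        cov = s2 * XtX_inv
    return np.sqrt(np.diag(cov))
}}}

With homoskedastic errors the two sets of standard errors should be close; they diverge when the error variance depends on the predictors.

----

CategoryRicottone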