In statistics, ordinary least squares (OLS) is a type of linear least squares method for estimating the unknown parameters in a linear regression model. OLS chooses the parameters of a linear function of a set of explanatory variables by the principle of least squares: minimizing the sum of the squares of the differences between the observed dependent variable(s) in the given dataset and those predicted by the linear function.
Geometrically, this is seen as the sum of the squared distances, parallel to the axis of the dependent variable, between each data point in the set and the corresponding point on the regression surface — the smaller the differences, the better the model fits the data. The resulting estimator can be expressed by a simple formula, especially in the case of simple linear regression, in which there is a single regressor on the right side of the regression equation.
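In the simple linear regression case just mentioned, the formula reduces to a slope equal to the sample covariance of the regressor and the response divided by the sample variance of the regressor, with the intercept chosen so the fitted line passes through the point of means. A minimal sketch (the function name `ols_simple` and the toy data are ours, not from the text):

```python
import numpy as np

def ols_simple(x, y):
    """Closed-form OLS for simple linear regression y ≈ a + b*x."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Slope: sample covariance of x and y over the sample variance of x.
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    # Intercept: forces the fitted line through the point of means (x̄, ȳ).
    a = y.mean() - b * x.mean()
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1.0 + 2.0 * x          # data lie exactly on a line, so residuals are zero
a, b = ols_simple(x, y)
print(a, b)                # intercept 1.0, slope 2.0
```

Because the toy data lie exactly on a line, the minimized sum of squared residuals here is zero.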
The OLS estimator is consistent when the regressors are exogenous and, by the Gauss–Markov theorem, optimal in the class of linear unbiased estimators when the errors are homoscedastic and serially uncorrelated. Under these conditions, the OLS method provides minimum-variance, mean-unbiased estimation when the errors have finite variances. Under the additional assumption that the errors are normally distributed, OLS is the maximum likelihood estimator.
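With several regressors, the estimator discussed above is the solution of the normal equations, commonly written \(\hat\beta = (X^\top X)^{-1} X^\top y\) for a design matrix \(X\). A sketch on synthetic data (the coefficients and noise level are ours, chosen for illustration):

```python
import numpy as np

# Synthetic data: y = 1.0 + 2.0*x1 - 0.5*x2 + small noise.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])  # constant column for the intercept
y = X @ np.array([1.0, 2.0, -0.5]) + 0.1 * rng.normal(size=n)

# Solve the least squares problem; lstsq is numerically more stable than
# forming (X'X)^{-1} explicitly.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # close to the true coefficients [1.0, 2.0, -0.5]
```

With exogenous regressors and small homoscedastic noise, the recovered coefficients land close to the true values, consistent with the unbiasedness claim above.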
For example, a fitted coefficient of 1.22 on a female indicator variable means that a female student's predicted GPA is 1.22 points higher than a male student's.
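The dummy-variable reading above can be reproduced on hypothetical data (the GPA values below are invented so the group means differ by exactly 1.22): with an intercept included, the OLS coefficient on a 0/1 indicator equals the difference in group means.

```python
import numpy as np

# Hypothetical GPA data; group means are 2.5 (male) and 3.72 (female).
gpa_male = np.array([2.3, 2.5, 2.7])
gpa_female = np.array([3.5, 3.72, 3.94])
y = np.concatenate([gpa_male, gpa_female])
female = np.concatenate([np.zeros(3), np.ones(3)])  # 0/1 indicator

# Regress GPA on a constant and the female indicator.
X = np.column_stack([np.ones_like(y), female])
intercept, coef = np.linalg.lstsq(X, y, rcond=None)[0]
print(round(coef, 2))  # 1.22 = mean(female GPA) - mean(male GPA)
```

Here the intercept estimates the male group mean and the indicator coefficient the female–male gap, which is exactly how the 1.22 figure is interpreted.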