Consider a k-variable linear regression model, i.e.,
Y = X1β1 + X2β2 + ε,
where X1 is (N × k1), X2 is (N × k2), and k = k1 + k2. As you may recall, adding columns to the X matrix (including additional regressors in the model) never decreases R², and in practice almost always increases it.
The adjusted R² attempts to correct for this phenomenon of an ever-increasing R². Show that the additional k2 variables (regressors) in this model increase the adjusted R² if the F-statistic for testing the joint statistical significance of the coefficients on these additional regressors (β2) is greater than one.
Regression analysis is a statistical procedure that allows us to quantify the linear association between a dependent (outcome) variable and one or more independent (predictor) variables. When a linear relationship exists between the variables, regression analysis lets us predict the value of the outcome variable from the observed values of the predictors.
The simple linear regression model takes the following form:
Y=a+bX
Here,
Y is the dependent variable
a is the intercept coefficient
b is the slope coefficient
X is the independent variable
The intercept of the regression equation is the expected value of the outcome variable when the predictor variable equals zero, and the slope is the expected change (increase or decrease) in the dependent variable for a one-unit increase in the independent variable.
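As a small illustration of these two interpretations, here is a minimal sketch that fits Y = a + bX by ordinary least squares; the simulated data and the true values a = 2.0, b = 0.5 are invented for the example.

```python
# A minimal sketch of fitting Y = a + b*X by ordinary least squares.
# The data and the true parameter values here are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)
Y = 2.0 + 0.5 * X + rng.normal(scale=1.0, size=100)  # true a = 2.0, b = 0.5

# Closed-form OLS estimates: b = cov(X, Y) / var(X), a = mean(Y) - b * mean(X)
b_hat = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
a_hat = Y.mean() - b_hat * X.mean()

print(f"intercept a = {a_hat:.3f}  (expected Y when X = 0)")
print(f"slope     b = {b_hat:.3f}  (expected change in Y per unit of X)")
```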
Turning to the question: adding new regressors raises the adjusted R² only if they reduce the estimated error variance of the regression, s² = e'e/(N − k); in that case the additional regressors improve the overall fit of the model. The derivation below makes this precise, and a short simulation then checks it numerically.
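Here is a sketch of the standard argument. N, k1, and k2 follow the question's setup; the notation e'e for residual sums of squares is an assumption not spelled out in the question.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $e_1'e_1$ and $e'e$ denote the residual sums of squares of the short model
($k_1$ regressors) and the full model ($k = k_1 + k_2$ regressors). Because
\[
\bar{R}^2 \;=\; 1-\frac{e'e/(N-k)}{\sum_i (y_i-\bar{y})^2/(N-1)},
\]
and the denominator is the same for both models, $\bar{R}^2$ rises exactly when
the error-variance estimate $s^2 = e'e/(N-k)$ falls. Then
\begin{align*}
\frac{e'e}{N-k} < \frac{e_1'e_1}{N-k_1}
  &\iff \frac{e_1'e_1}{e'e} > \frac{N-k_1}{N-k}\\
  &\iff \frac{e_1'e_1-e'e}{e'e} > \frac{(N-k_1)-(N-k)}{N-k} = \frac{k_2}{N-k}\\
  &\iff \frac{(e_1'e_1-e'e)/k_2}{e'e/(N-k)} \;=\; F[k_2,\,N-k] \;>\; 1 .
\end{align*}
\end{document}
```

The equivalence runs in both directions, so the adjusted R² falls when F < 1 and is unchanged when F = 1.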
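For a numerical sanity check, the following sketch simulates one data set, fits the short and full models, and compares the sign of the adjusted-R² change with the F-statistic; the data-generating values are invented for the example.

```python
# A quick numerical check of the F > 1 criterion on simulated data;
# the sample size, coefficients, and noise level are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
N, k1, k2 = 50, 3, 2
X1 = np.column_stack([np.ones(N), rng.normal(size=(N, k1 - 1))])
X2 = rng.normal(size=(N, k2))
y = X1 @ np.array([1.0, 0.5, -0.3]) + 0.1 * X2[:, 0] + rng.normal(size=N)

def ssr_and_rbar2(X, y):
    """Residual sum of squares and adjusted R^2 of an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    ssr = e @ e
    sst = ((y - y.mean()) ** 2).sum()
    n, k = X.shape
    rbar2 = 1 - (ssr / (n - k)) / (sst / (n - 1))
    return ssr, rbar2

ssr1, rbar2_short = ssr_and_rbar2(X1, y)                 # k1 regressors only
ssr, rbar2_full = ssr_and_rbar2(np.hstack([X1, X2]), y)  # all k1 + k2 regressors

k = k1 + k2
F = ((ssr1 - ssr) / k2) / (ssr / (N - k))
print(f"F = {F:.3f}, adjusted R^2 change = {rbar2_full - rbar2_short:+.4f}")
# The change in adjusted R^2 is positive exactly when F > 1.
```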