The coefficient of determination, *R*^{2}, is a measure of how well the explanatory variables predict the explained variable. It is the proportion of variability in the dependent variable that is accounted for by the linear relationship with the independent variables. It provides a measure of how well future outcomes are likely to be predicted by the model.

There are several different definitions of *R*^{2} which are only sometimes equivalent. One class of such cases includes that of linear regression. In this case, if an intercept is included, then *R*^{2} is simply the square of the sample correlation coefficient between the outcomes and their predicted values, or, in the case of simple linear regression, between the outcomes and the values of the single regressor being used for prediction. In such cases, the coefficient of determination ranges from 0 to 1. Depending on the definition used, *R*^{2} can take negative values in certain important cases: when the predictions being compared to the corresponding outcomes were not derived from a model-fitting procedure using those data, and when linear regression is conducted without including an intercept. Additionally, negative values of *R*^{2} may occur when fitting non-linear trends to data.^{[1]} In these instances, the mean of the data provides a fit to the data that is superior to that of the trend under this goodness-of-fit analysis.
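The two claims above can be checked numerically with the common computational definition *R*^{2} = 1 − SS_res/SS_tot. A minimal sketch (the helper `r_squared` and the example data are illustrative, not from the source): for simple linear regression with an intercept, *R*^{2} equals the squared sample correlation between the regressor and the outcomes; and a fit worse than the sample mean yields a negative *R*^{2}.

```python
def r_squared(y, y_hat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_bar = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    ss_tot = sum((yi - y_bar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Simple linear regression with an intercept (hypothetical data):
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 7.8]
x_bar, y_bar = sum(x) / len(x), sum(y) / len(y)
sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
sxx = sum((xi - x_bar) ** 2 for xi in x)
syy = sum((yi - y_bar) ** 2 for yi in y)
slope = sxy / sxx
intercept = y_bar - slope * x_bar
fitted = [intercept + slope * xi for xi in x]

r2 = r_squared(y, fitted)
corr_sq = sxy ** 2 / (sxx * syy)   # squared sample correlation
assert abs(r2 - corr_sq) < 1e-12   # identical for OLS with intercept

# Predictions not fitted to these data can score below the mean,
# giving a negative R^2 (the mean of y, 2.0, fits better):
bad_fit = [3.0, 3.0, 3.0]
print(r_squared([1.0, 2.0, 3.0], bad_fit))  # -1.5
```

Because the mean itself has *R*^{2} = 0 under this definition, any predictions with a larger residual sum of squares than the mean necessarily produce a negative value.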

## Notes

- ↑ Cameron, A. C.; Windmeijer, F. A. G. (1997). "An R-squared measure of goodness of fit for some common nonlinear regression models". *Journal of Econometrics*, 77 (2): 329–342.