Python Forum
Difference between R^2 and .score - Printable Version

Thread: Difference between R^2 and .score (/thread-23610.html)



Difference between R^2 and .score - donnertrud - Jan-08-2020

Hi guys,

As far as I know, you can calculate the accuracy of a regression model with sklearn's LinearRegression.
The code would be something like this:

from sklearn.linear_model import LinearRegression
model = LinearRegression()
model.fit(X_train, y_train)  # the model has to be fitted before scoring
accuracy = model.score(X_test, y_test)

and print(accuracy) would give you a value for how well the model performs in predicting y.
Then there is also r2_score from sklearn.metrics.
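
For reference, I'd call it on the model's predictions, something like this (reusing the fitted model and test split from above):

from sklearn.metrics import r2_score

y_pred = model.predict(X_test)   # predictions on the test split
r2 = r2_score(y_test, y_pred)    # compare predictions to the true targets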
Is there a difference between those two values? If so, what is the difference?
Thanks in advance!


RE: Difference between R^2 and .score - jefsummers - Jan-08-2020

r2_score:
R^2 (coefficient of determination) regression score function.

Best possible score is 1.0 and it can be negative (because the model can be arbitrarily worse). A constant model that always predicts the expected value of y, disregarding the input features, would get an R^2 score of 0.0.

Linear Regression score:
Returns the coefficient of determination R^2 of the prediction.

The coefficient R^2 is defined as (1 - u/v), where u is the residual sum of squares ((y_true - y_pred) ** 2).sum() and v is the total sum of squares ((y_true - y_true.mean()) ** 2).sum(). The best possible score is 1.0 and it can be negative (because the model can be arbitrarily worse). A constant model that always predicts the expected value of y, disregarding the input features, would get an R^2 score of 0.0.
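
To make that concrete, here is a minimal sketch of those two sums in NumPy (y_true and y_pred are just small placeholder arrays), checked against r2_score:

import numpy as np
from sklearn.metrics import r2_score

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

u = ((y_true - y_pred) ** 2).sum()         # residual sum of squares
v = ((y_true - y_true.mean()) ** 2).sum()  # total sum of squares
print(1 - u / v)                           # 0.9486...
print(r2_score(y_true, y_pred))            # same number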

These descriptions are the same because it is the same metric: for a regressor, model.score(X_test, y_test) computes R^2 via r2_score(y_test, model.predict(X_test)), so the two values should be identical. I'd still run both and compare to confirm.
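
For example, a quick check on a made-up dataset (make_regression and the train/test split here are just placeholders):

from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=200, n_features=3, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)
print(model.score(X_test, y_test))              # R^2 from .score
print(r2_score(y_test, model.predict(X_test)))  # same value from r2_score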