 Difference between R^2 and .score
#1
Hi guys,

As far as I know, you can calculate the accuracy of a regression model with sklearn's LinearRegression. The code would be something like this:

from sklearn.linear_model import LinearRegression

model = LinearRegression()
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)

and print(accuracy) would give you a value for how well the model performs at predicting y.
Then there is also r2_score from sklearn.metrics.
Is there a difference between those two values? If so, what is the difference?
Thanks in advance!
#2
r2_score:
R^2 (coefficient of determination) regression score function.

Best possible score is 1.0 and it can be negative (because the model can be arbitrarily worse). A constant model that always predicts the expected value of y, disregarding the input features, would get a R^2 score of 0.0.


Linear Regression score:
Returns the coefficient of determination R^2 of the prediction.

The coefficient R^2 is defined as (1 - u/v), where u is the residual sum of squares ((y_true - y_pred) ** 2).sum() and v is the total sum of squares ((y_true - y_true.mean()) ** 2).sum(). The best possible score is 1.0 and it can be negative (because the model can be arbitrarily worse). A constant model that always predicts the expected value of y, disregarding the input features, would get a R^2 score of 0.0.
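That (1 - u/v) definition can be checked by hand with NumPy; a minimal sketch, with made-up toy numbers:

```python
import numpy as np

# toy values, chosen only for illustration
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

u = ((y_true - y_pred) ** 2).sum()          # residual sum of squares
v = ((y_true - y_true.mean()) ** 2).sum()   # total sum of squares
r2 = 1 - u / v

print(r2)  # ≈ 0.9486
```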

These descriptions look similar because they describe the same quantity: for a regressor, model.score(X, y) is computed as r2_score(y, model.predict(X)), so the two return identical values (not one the square of the other). Running both and comparing confirms it.
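A quick way to see the two agree; a minimal sketch assuming scikit-learn is installed (the synthetic data and coefficients below are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# synthetic regression data: y is a linear function of X plus small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

model = LinearRegression().fit(X, y)

score_value = model.score(X, y)            # R^2 via the estimator
r2_value = r2_score(y, model.predict(X))   # R^2 via the metric

print(np.isclose(score_value, r2_value))   # → True
```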
