Difference between R^2 and .score

donnertrud, Silly Frenchman. Posts: 21, Threads: 14, Joined: Dec 2019, Reputation: 0, Likes received: 0
#1, Jan-08-2020, 04:23 PM

Hi guys,

As far as I know, you can calculate a score for a regression model with sklearn's LinearRegression. The code would be something like this:

model = LinearRegression()
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(accuracy)

This gives you a value for how well the model predicts y. Then there is also r2_score from sklearn.metrics. Is there a difference between those two values? If so, what is the difference? Thanks in advance!

jefsummers, Verb Conjugator. Posts: 688, Threads: 1, Joined: May 2019, Reputation: 67, Likes received: 94
#2, Jan-08-2020, 05:14 PM

From the scikit-learn docs:

r2_score: R^2 (coefficient of determination) regression score function. The best possible score is 1.0, and it can be negative (because the model can be arbitrarily worse). A constant model that always predicts the expected value of y, disregarding the input features, would get an R^2 score of 0.0.

LinearRegression.score: Returns the coefficient of determination R^2 of the prediction. The coefficient R^2 is defined as (1 - u/v), where u is the residual sum of squares, ((y_true - y_pred) ** 2).sum(), and v is the total sum of squares, ((y_true - y_true.mean()) ** 2).sum(). The best possible score is 1.0, and it can be negative, with the same interpretation as above.

These descriptions describe the same quantity: both compute the coefficient of determination R^2, so for the same model and data they should return the same value. If in doubt, run both and compare.
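A minimal sketch confirming the reply above: on synthetic data (the data, coefficients, and variable names here are made up for illustration), `model.score(X, y)` and `r2_score(y, model.predict(X))` produce the same R^2 value.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Synthetic regression data: y is a linear function of X plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

model = LinearRegression().fit(X, y)

# Two ways to compute the coefficient of determination R^2.
score_value = model.score(X, y)                 # estimator's .score method
r2_value = r2_score(y, model.predict(X))        # metric on predictions

print(score_value, r2_value)  # the two values are identical
```

Both paths compute (1 - u/v) from the same residuals, so any difference you see between them points to a mismatch in the data or predictions being passed in, not to a different metric.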
