Another easier-to-find example of what I'm describing is in the GitHub repo for the book "Introduction to Machine Learning with Python" (github.com/amueller/introduction_to_ml_with_python)
Look at: 02-supervised-learning.ipynb
Then, for instance, with Ridge Regression:
from sklearn.linear_model import Ridge
ridge = Ridge().fit(X_train, y_train)
print("Training set score: {:.2f}".format(ridge.score(X_train, y_train)))
print("Test set score: {:.2f}".format(ridge.score(X_test, y_test)))
==
How do X_train and y_train get populated?
The presentation is in snippets, not complete, ready-to-go code examples. This seems like a common theme in this and other books like it: leaving it up to the reader to figure things out rather than providing complete code examples.
I want standalone examples that run so I can focus on using rather than spending time just getting them to work.
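For illustration, here's a minimal standalone sketch of what I mean. In the notebook, X_train and y_train are populated by earlier cells; this version uses a built-in sklearn dataset (load_diabetes, my substitution, not the book's data) and train_test_split so the snippet runs on its own:

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Load a built-in regression dataset as a stand-in for the book's data,
# which is prepared in earlier notebook cells.
X, y = load_diabetes(return_X_y=True)

# This split is the step that populates X_train, y_train, X_test, y_test.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ridge = Ridge().fit(X_train, y_train)
print("Training set score: {:.2f}".format(ridge.score(X_train, y_train)))
print("Test set score: {:.2f}".format(ridge.score(X_test, y_test)))
```

The exact scores won't match the book's (different data), but the point is that it runs top-to-bottom without hunting through earlier cells.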
Perhaps I don't understand these notebooks... that's possible.
In any case, I just wanted to post an easier-to-get-to example of what I was talking about.
Thanks,