Python Forum

Full Version: Are there any techniques for improving logloss?
I use log loss as a quality criterion, but it has a number of disadvantages (see the short sketch after this list):
1. It produces an unbounded (infinite) loss for confident misclassifications, i.e. when the predicted probability of the true class approaches zero.
2. It has a comparatively small gradient in the "border" zone near p ~ 0.5, which is exactly where discrimination matters most.
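For context, here is a minimal sketch of both points, assuming a binary problem with a sigmoid output and NumPy (the function name logloss is just mine for illustration): the loss blows up for a confident wrong prediction, and the gradient of the loss with respect to the logit, which is p - y for a sigmoid output, is smaller near the boundary than for a confidently wrong example.

import numpy as np

def logloss(y, p):
    # Per-sample binary log loss for true label y and predicted probability p
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

# 1. Unbounded loss for a confident misclassification (true class is 1, p -> 0)
print(logloss(1, 1e-15))  # ~34.5, and it grows without bound as p -> 0
print(logloss(1, 0.0))    # inf (NumPy emits a divide-by-zero warning)

# 2. For a sigmoid output the gradient of the loss w.r.t. the logit is (p - y),
#    so near the boundary (p ~ 0.5) its magnitude is only ~0.5, while a
#    confidently wrong prediction (p ~ 0 with y = 1) gets magnitude ~1,
#    i.e. the loss pushes harder on outliers than on borderline cases.
for p in (0.5, 0.9, 1e-3):
    print(p, abs(p - 1))  # |dL/dz| for y = 1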

Are there any modifications to this loss function that improve the performance of machine learning algorithms?