Closed mikeDTI closed 4 years ago
Then just re-edit for the AUC tuning option as well.
Managed to replicate this issue on the test data available in examples/ as well,
by commenting out the other algorithms that compete during training before running the tuning script.
Thanks to @mikeDTI's suggestion (with some minor tweaks), this issue has been resolved in the discrete/supervised/tuning.py
script by modifying the relevant lines so the eta0 error no longer appears. Both maximizing on AUC and on balanced accuracy have been tested, and the most recent push reflects these changes.
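For reference, a minimal sketch of how the two scoring options can be expressed with scikit-learn's `make_scorer`; this is an illustration only, not the exact lines in tuning.py:

```python
from sklearn import metrics

# Maximize AUC: needs probability estimates, so the tuned estimator must expose
# predict_proba (e.g. SGDClassifier with a probabilistic loss such as 'log').
auc_scorer = metrics.make_scorer(metrics.roc_auc_score, needs_proba=True)

# Maximize balanced accuracy: works from hard class predictions.
bal_acc_scorer = metrics.make_scorer(metrics.balanced_accuracy_score, needs_proba=False)
```

(Note that `needs_proba` has been superseded by `response_method` in newer scikit-learn releases.)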
Tuning on AUC:
Tuning on balanced accuracy:
You are a certified bug killer, @m-makarious!
Please make sure that this is a bug.
System information:
Describe the current behavior: SGDClassifier hits an eta0 ValueError during tuning
ValueError: eta0 must be > 0
Describe the expected behavior: tuning should complete without this error; eta0 needs to be set > 0 whenever a learning_rate other than 'optimal' is used
Code to reproduce the issue: See DementiaSeqML.
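For reference, a minimal standalone sketch that triggers the same error (illustrative only; this is plain scikit-learn, not the DementiaSeqML pipeline):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=100, n_features=10, random_state=0)

# eta0 defaults to 0.0, but any learning_rate other than 'optimal' requires eta0 > 0,
# so fitting raises: ValueError: eta0 must be > 0
clf = SGDClassifier(learning_rate='adaptive')
clf.fit(X, y)
```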
Other Information / Logs: Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
Archive.zip
To fix this, we just need to constrain eta0 at tune time when SGDClassifier is the winning algorithm.
Might be safe to change this:

```python
elif best_algo == 'SGDClassifier':
    hyperparameters = {'alpha': [1e-4, 1e-3, 1e-2, 1e-1, 1e0, 1e1, 1e2, 1e3],
                       "learning_rate": ["constant", "optimal", "invscaling", "adaptive"]}
    scoring_metric = metrics.make_scorer(metrics.balanced_accuracy_score, needs_proba=False)
```
To this:

```python
elif best_algo == 'SGDClassifier':
    hyperparameters = {
        'alpha': [1e-4, 1e-3, 1e-2, 1e-1, 1e0, 1e1, 1e2, 1e3],  # regularization strength; drives the 'optimal' learning-rate schedule
        'n_iter': [1000],  # number of epochs
        'loss': ['log'],   # logistic regression
        'penalty': ['l2'],
        'n_jobs': [-1]
    }
    scoring_metric = metrics.make_scorer(metrics.balanced_accuracy_score, needs_proba=False)
```
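An alternative sketch, if the extra learning-rate schedules are worth keeping: always include an eta0 > 0 in the grid so the non-'optimal' schedules can't trigger the ValueError. This is an illustration with placeholder data and a plain GridSearchCV, not the repo's tuning script:

```python
from sklearn import metrics
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, n_features=20, random_state=0)

hyperparameters = {
    'alpha': [1e-4, 1e-3, 1e-2, 1e-1, 1e0, 1e1, 1e2, 1e3],
    'learning_rate': ['constant', 'optimal', 'invscaling', 'adaptive'],
    'eta0': [1e-3, 1e-2, 1e-1],  # must be > 0 for every schedule except 'optimal'
}
scoring_metric = metrics.make_scorer(metrics.balanced_accuracy_score, needs_proba=False)

search = GridSearchCV(SGDClassifier(), hyperparameters, scoring=scoring_metric, cv=5)
search.fit(X, y)
print(search.best_params_)
```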