Nested cross-validation for unbiased predictions. Can be used with Scikit-Learn, XGBoost, Keras and LightGBM, or any other estimator that implements the scikit-learn interface.
MIT License · 62 stars · 20 forks
Add support for other ways of splitting data (Only KFold is supported) #7
It should be possible to pass either a number (in which case KFold is used by default, as it is now) or another way of splitting the data, e.g. Leave-One-Out, Stratified KFold, etc. (check here). A sketch of what this could look like follows below.
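A minimal sketch of how the `cv` argument could be resolved, assuming an integer keeps the current KFold behaviour while any object with a `split()` method is passed through. The helper name `resolve_cv` and its signature are hypothetical, not part of the library:

```python
from sklearn.model_selection import KFold, StratifiedKFold, LeaveOneOut

def resolve_cv(cv, stratified=False):
    """Return a CV splitter from either an int or a scikit-learn splitter (hypothetical helper)."""
    if isinstance(cv, int):
        # Preserve the current behaviour: an integer means KFold with that many
        # splits (or StratifiedKFold if stratification is requested).
        return StratifiedKFold(n_splits=cv) if stratified else KFold(n_splits=cv)
    # Anything exposing split() (LeaveOneOut, StratifiedKFold, GroupKFold,
    # a custom splitter, ...) is used as-is.
    if hasattr(cv, "split"):
        return cv
    raise TypeError("cv must be an int or a scikit-learn CV splitter")

# Usage:
outer_cv = resolve_cv(5)              # KFold(n_splits=5), the current default
inner_cv = resolve_cv(LeaveOneOut())  # passed straight through
```

This mirrors how scikit-learn itself handles the `cv` parameter, so both the outer and inner loops of the nested cross-validation could accept any of its splitters without changing the existing integer-based API.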