adivekar-utexas opened 1 year ago
BTW, I have successfully used Repeater to do K-Fold as of yesterday. So it's definitely possible.
@adivekar-utexas Could you share the implementation?
I've managed to get K-fold cross-validation working with TorchTrainer and the BasicVariantGenerator. The approach I took involves using the constant_grid_search=True parameter of BasicVariantGenerator.
...
from ray import tune
from ray.tune.search.basic_variant import BasicVariantGenerator

search_space = {
    'kfold': tune.grid_search([1, 2, 3, 4, 5]),  # grid search over the 5 folds
    # ... other hyperparameters
}
tuner = tune.Tuner(
    trainer,  # instance of TorchTrainer
    param_space={'train_loop_config': search_space},
    tune_config=tune.TuneConfig(
        scheduler=scheduler,
        search_alg=BasicVariantGenerator(
            # Required for k-folds: random (non-grid) parameters are sampled
            # once and held constant across the grid, so every fold runs with
            # the same hyperparameter configuration.
            constant_grid_search=True,
        ),
    ),
)
...
In the train_loop_per_worker function of trainer I handle the actual cross-validation logic. This is fairly implementation-specific, so I'm not sure how much help it is to anyone else, but it at least ensures that the hyperparameters remain constant across folds and that the folds can be executed in parallel.
@adivekar-utexas Would you please share how you got Repeater working with BasicVariantGenerator? Thank you!
Description
From this discussion: https://discuss.ray.io/t/raytune-use-repeater-with-basicvariantgenerator/9042
Currently, you can't use a BasicVariantGenerator with a Repeater.
Use case
This should be supported, so that something like this works:
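A minimal sketch of the requested combination; the trainable and search space here are illustrative placeholders, and today this fails because Repeater wraps suggestion-based Searchers, which BasicVariantGenerator is not:

from ray import tune
from ray.tune.search import Repeater
from ray.tune.search.basic_variant import BasicVariantGenerator

# Desired: repeat each grid-search variant 5 times, e.g. once per fold.
# Repeater injects a '__trial_index__' key into each repeat's config,
# which could then serve as the fold index inside the trainable.
search_alg = Repeater(BasicVariantGenerator(), repeat=5)

tuner = tune.Tuner(
    my_trainable,  # illustrative placeholder
    param_space={'lr': tune.grid_search([1e-3, 1e-4])},
    tune_config=tune.TuneConfig(search_alg=search_alg),
)
results = tuner.fit()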
Repeater is one suggested way to implement K-fold cross-validation in Ray, and K-fold CV is a common request in Ray Tune.