Closed: goord closed this issue 12 months ago
I don't think so.
Btw, we used the hyperopt library, but since it hasn't been maintained in a while (I see it started having new commits in the last few months, but it had been stale since ~2021), maybe it makes sense to switch to some other library...
Back when hyperopt was implemented I also implemented optuna and it worked fine, but I chose to stay with hyperopt because I have a great ability to select the winner technology :)
(For what we were doing, hyperopt was much simpler to use and required less tweaking. In any case, what I'm trying to say is: feel free to change hyperopt to some other library. The hyperoptimization library is not coupled at all with the rest of the code, so it should be easy.)
Here's a patch with my old optuna implementation: patch_optuna.zip
The changes needed were really minimal (I'm guessing the patch itself won't work since it is 4 years old), so using other hyperoptimization libraries should also be possible!
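For reference, a minimal sketch of what such a swap could look like on the optuna side; the study name, parameter names, and the toy loss below are placeholders for illustration, not the actual search space or fit of this code base:

```python
# Hedged sketch of replacing a hyperopt search with optuna.
# The parameter names and the toy loss are placeholders, not the real fit.
import optuna


def objective(trial):
    # Sample the same kind of space hyperopt would build with hp.choice / hp.loguniform
    optimizer = trial.suggest_categorical("optimizer", ["adam", "nadam"])
    learning_rate = trial.suggest_float("learning_rate", 1e-4, 1e-1, log=True)
    # Placeholder loss; in the real code this would run the fit and return its figure of merit
    return (learning_rate - 0.01) ** 2 + (0.0 if optimizer == "adam" else 0.1)


# A persistent storage backend also gives checkpoint/restart essentially for free:
# rerunning the script with the same study_name resumes the existing study.
study = optuna.create_study(
    study_name="fit_hyperopt",
    storage="sqlite:///fit_hyperopt.db",
    load_if_exists=True,
    direction="minimize",
)
study.optimize(objective, n_trials=50)
print(study.best_params)
```

(The storage/load_if_exists part is also relevant to the checkpointing question below, since optuna persists trials to the database as they finish.)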
Does the hyperoptimization of the fit currently support checkpointing and restarting the bayesian optimization process?
As I mentioned in the slack, this feature is not available. But if picking a given n-th trial as a starting point is needed, then this can be easily implemented by loading the json files that are already written to disk and using them as the starting input to the trials entry.
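A minimal sketch of that idea, assuming hyperopt is still the backend and that each finished trial was dumped to a json file holding its hyperparameter values; the directory, file layout, search space, and objective below are assumptions for illustration, not the actual format written by the fit:

```python
# Hedged sketch: warm-starting a hyperopt run from trials saved as json files.
# The "previous_trials/*.json" layout (a "params" entry with the sampled values)
# and the space/objective are assumptions, not the real on-disk format.
import glob
import json

from hyperopt import fmin, hp, tpe
from hyperopt.fmin import generate_trials_to_calculate

# Collect the hyperparameter values of the trials that were already run.
# Note: the values must match what hyperopt stores in trials.vals
# (e.g. hp.choice parameters are stored as indices, not labels).
previous_points = []
for path in sorted(glob.glob("previous_trials/*.json")):
    with open(path) as stream:
        previous_points.append(json.load(stream)["params"])

# Seed a Trials object so these points are evaluated first; the bayesian
# (TPE) optimization then continues from the resulting history.
trials = generate_trials_to_calculate(previous_points)

space = {"learning_rate": hp.loguniform("learning_rate", -9, -2)}


def objective(params):
    # Placeholder loss; in the real code this would run the fit.
    return (params["learning_rate"] - 0.01) ** 2


best = fmin(
    fn=objective,
    space=space,
    algo=tpe.suggest,
    trials=trials,
    max_evals=len(previous_points) + 25,  # replay old points, then add new ones
)
print(best)
```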
Closed by #1824