bbudescu closed this issue 4 months ago
Sorry for asking so many questions in such a short time. I'm trying to decide which optimization package to use for a project, so I'm checking support for the features I need or would like.
In principle, OpenBox is designed to be a black-box optimization service. For multi-fidelity optimization, which treats the problem as a "grey box", we provide algorithms built on OpenBox's basic components, including Hyperband, BOHB, MFES-HB, and ASHA. Please refer to openbox/apps/multi_fidelity/ for more details. For more advanced multi-fidelity algorithms aimed at large-scale parallel optimization, which combine ASHA, MFES, and other strategies, you can try HyperTune: https://github.com/PKU-DAIR/HyperTune .
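To illustrate the "grey-box" idea behind these methods, here is a minimal sketch of successive halving, the core loop shared by Hyperband/ASHA-style algorithms. This is not OpenBox code; the `evaluate` function and all parameters are hypothetical placeholders:

```python
import random

def evaluate(config, resource):
    # Hypothetical objective: pretend to train `config` for `resource`
    # epochs. A real run would launch a training job and return its loss.
    return (config["lr"] - 0.1) ** 2 + 1.0 / resource + random.uniform(0, 0.01)

def successive_halving(configs, min_resource=1, max_resource=27, eta=3):
    """Evaluate all candidates at a low resource level, keep the best
    1/eta fraction, and repeat with eta times more resource."""
    resource = min_resource
    while resource <= max_resource and len(configs) > 1:
        scores = sorted((evaluate(c, resource), c["lr"]) for c in configs)
        keep = max(1, len(scores) // eta)
        configs = [{"lr": lr} for _, lr in scores[:keep]]  # lower loss is better
        resource *= eta
    return configs[0]

random.seed(0)
candidates = [{"lr": random.uniform(0.001, 0.5)} for _ in range(27)]
best = successive_halving(candidates)
print(best)
```

Hyperband runs several such brackets with different trade-offs between the number of candidates and the starting resource; ASHA performs the promotions asynchronously so workers never idle.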
The resource dimension is not treated as simply another hyperparameter in our design. In MFES-HB and HyperTune, an ensemble model is fitted on observations (evaluation results) from different resource levels.
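A toy sketch of that ensemble idea (in the spirit of MFES-HB, not OpenBox's actual implementation): fit one surrogate per resource level, then weight each level's model by how well it predicts the highest-fidelity observations. The linear "surrogate", the data, and the weighting scheme here are all illustrative assumptions:

```python
def fit_linear_model(observations):
    # Toy surrogate: least-squares line through (x, loss) pairs.
    n = len(observations)
    mean_x = sum(x for x, _ in observations) / n
    mean_y = sum(y for _, y in observations) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in observations)
    var = sum((x - mean_x) ** 2 for x, _ in observations) or 1.0
    slope = cov / var
    return lambda x: mean_y + slope * (x - mean_x)

# Hypothetical (x, loss) observations grouped by resource level;
# higher levels are more trustworthy but have fewer points.
obs_by_level = {
    1: [(0.1, 0.9), (0.5, 0.7), (0.9, 0.5)],
    3: [(0.2, 0.6), (0.8, 0.3)],
    9: [(0.5, 0.35)],
}
models = {r: fit_linear_model(o) for r, o in obs_by_level.items()}

# Weight each level's model by its inverse squared error on the
# top-fidelity data, then normalize the weights.
top = max(obs_by_level)
raw = {}
for r, model in models.items():
    err = sum((model(x) - y) ** 2 for x, y in obs_by_level[top]) + 1e-6
    raw[r] = 1.0 / err
total = sum(raw.values())
weights = {r: w / total for r, w in raw.items()}

def ensemble_predict(x):
    # Weighted combination of all per-level surrogates.
    return sum(weights[r] * models[r](x) for r in models)

print(weights, ensemble_predict(0.5))
```

The point is that cheap low-fidelity observations still inform the model, but their influence shrinks when they disagree with expensive high-fidelity results.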
Cross-task transfer learning with multi-fidelity is not currently provided as an open-source part of OpenBox. To support this feature, you may need to modify the existing code.
For early stopping, users may implement the strategy themselves during optimization via the ask-and-tell interface. We will soon update the Optimizer API to support user-defined early-stopping functions.
E.g., is the resource dimension (e.g., the number of epochs or the number of training samples) treated as just another hyperparameter? How can one do cross-task transfer learning with multi-fidelity, e.g., how does one report cross-validation accuracy at every epoch for previous trials?