Saethox closed this pull request 2 years ago
While reviewing the code, I found no issues or things I would want to have changed. However, I did not get to try it out yet; maybe some more points arise then. What would be great would be a usage guide and, more importantly, an extension guide (although I do realize the latter can be quite complex). I think the examples already help with the former, but as soon as we want to change some details, it can get hard to assess the interdependencies. More out of interest than critique: Is there a particular advantage with Optuna? What does it do that skopt does not? What does it do better? I'm just interested in your thought process. If I saw it correctly, it has CMA-ES sampling, which is always nice.
Is there a particular advantage with Optuna? What does it do that skopt does not? What does it do better?
Optuna supports nested search spaces, i.e., tuning parameters that only arise if some other parameter is set to a certain value. I figured that evaluating all 14 individual_optimizer versions, like I did in my thesis, would be too much. So I'm also letting the tuner decide what subversion of metaheuristic to use, e.g., Bitwise or DimensionFlips ABCA. But these components can have different parameters, and that's what skopt doesn't support.
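Roughly, such a nested (conditional) search space can be expressed in Optuna like this; the parameter names below are only illustrative, not the ones actually tuned in this repository:

```python
import optuna

def objective(trial):
    # The choice of metaheuristic is itself a tunable parameter ...
    optimizer = trial.suggest_categorical("individual_optimizer", ["ga", "abca"])
    # ... and the parameters below are only sampled for the chosen branch,
    # which is exactly the "nested" behaviour a flat search space cannot express.
    if optimizer == "ga":
        mutation_rate = trial.suggest_float("ga_mutation_rate", 1e-3, 1e-1, log=True)
    else:
        n_flips = trial.suggest_int("abca_dimension_flips", 1, 8)
    # Placeholder objective value; a real run would fit and evaluate the model here.
    return 0.0

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
```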
How do you execute any of the runs (e.g. runs/minimal.py) locally? If I try to, it gets stuck at "/home/gl-455/suprb2/suprb2/individual/mixing_model.py", line 14, in ErrorExperienceHeuristic. Same for the other ones in the runs folder.
Can you give the exact error message? Is your installation of suprb2 and suprb2opt up to date? Because we don't use versioning, you have to manually install new versions from master using:
pip install --upgrade -e git+https://github.com/heidmic/suprb2@master#egg=suprb2
Can you execute the examples in the original suprb2 repo?
Did you try it again yet, @RomanSraj, with the updated version (https://github.com/heidmic/suprb2-experimentation/pull/6#issuecomment-1012085944)?
Yes, sorry. Totally forgot to update. I was missing access to suprbopt. It works fine now
Experiment framework
Closes #3
The old idea of nested constructor calls was abandoned in favor of this much more flexible "builder-like" syntax. A simple example can be found in runs/rf.py.
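Purely as a sketch of the builder idea (the class and method names below are hypothetical and not the repository's actual API, nor the contents of runs/rf.py), such a definition might read:

```python
from dataclasses import dataclass, field
from typing import Optional

from sklearn.base import BaseEstimator
from sklearn.ensemble import RandomForestRegressor


@dataclass
class Experiment:
    """Hypothetical container, used only to show the chained ("builder-like") calls."""
    name: str
    estimator: Optional[BaseEstimator] = None
    random_states: list = field(default_factory=list)

    def with_estimator(self, estimator: BaseEstimator) -> "Experiment":
        self.estimator = estimator
        return self  # returning self is what enables the chaining below

    def with_random_states(self, random_states) -> "Experiment":
        self.random_states = list(random_states)
        return self


# Chained definition instead of deeply nested constructor calls:
experiment = (
    Experiment(name="rf")
    .with_estimator(RandomForestRegressor())
    .with_random_states(range(8))
)
```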
Hyperparameter Tuning
Closes #1
Tuning using skopt is now possible. Other frameworks like hyperopt, sklearn-genetic-opt, ~~optuna~~ or ray.tune would be interesting as well, especially because skopt doesn't support nested search spaces.
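For contrast with the Optuna sketch above, a flat skopt search space looks like the following, where every dimension is sampled in every trial regardless of which metaheuristic is chosen (parameter names are again only illustrative):

```python
from skopt import gp_minimize
from skopt.space import Integer, Real

# Flat search space: every dimension exists for every evaluation, so parameters
# that only make sense for one metaheuristic cannot be made conditional.
space = [
    Real(1e-3, 1e-1, prior="log-uniform", name="mutation_rate"),
    Integer(8, 64, name="population_size"),
]

def objective(params):
    mutation_rate, population_size = params
    # Placeholder objective value; a real run would train and evaluate the model here.
    return 0.0

result = gp_minimize(objective, space, n_calls=10, random_state=0)
```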
Edit: Added support for optuna, which supports nested search spaces.
Problems
Closes #5
Several test functions and ~~five~~ seven datasets from the UCI Repository were added and are exposed similarly to the built-in scikit-learn datasets.
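For reference, this is the scikit-learn convention being mirrored; the loader name for the added UCI datasets is only a hypothetical example, not necessarily the one used in the repository:

```python
from sklearn.datasets import load_diabetes

# Built-in scikit-learn datasets are exposed as load_*/fetch_* functions
# that can return plain (X, y) arrays.
X, y = load_diabetes(return_X_y=True)

# The added test functions and UCI datasets presumably follow the same pattern,
# e.g. something like (hypothetical name):
# X, y = load_combined_cycle_power_plant(return_X_y=True)
```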