heidmic / suprb-experimentation

GNU General Public License v3.0

Refactor experiment framework #6

Closed Saethox closed 2 years ago

Saethox commented 2 years ago

Experiment framework

Closes #3

The old idea of nested constructor calls was abandoned in favor of this much more flexible "builder-like" syntax. Here's a simple example, taken from runs/rf.py:

# Create the base experiment, using some default tuner
experiment = Experiment(tuner=default_tuner, verbose=10)
# Add global tuning of the `n_estimators` parameter. 
# It is tuned by itself first, and afterwards, the fixed value is propagated to nested experiments,
# because `propagate` is not set.
experiment.with_tuning({'n_estimators': Integer(1, 200)})

# Create a nested experiment using the MAE.
mae_experiment = experiment.with_params({'criterion': 'absolute_error'})
# Tune only this experiment on some parameter. Note that a different tuner is used here.
mae_experiment.with_tuning({'bootstrap': Categorical([True, False])}, tuner=extensive_tuner)

# Create a nested experiment using the MSE.
mse_experiment = experiment.with_params({'criterion': 'squared_error'})

# Add global tuning of the `max_depth` parameter. Because `propagate` is set here,
# the value is tuned anew for every nested experiment.
experiment.with_tuning({'max_depth': Integer(1, 5)}, propagate=True)

# Evaluation using cross-validation and an external test set
evaluation = CrossValidateTest(estimator=estimator, X_train=X_train, y_train=y_train, X_test=X_test, y_test=y_test,
                               random_state=random_state, verbose=10)

experiment.perform(evaluation, cv=8, n_jobs=4)

Hyperparameter Tuning

Closes #1

Tuning using skopt is now possible. Other frameworks like hyperopt, sklearn-genetic-opt, ~optuna~, or ray.tune would be interesting as well, especially because skopt doesn't support nested search spaces.

Edit: Added support for optuna, which supports nested search spaces.
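
For illustration, here's a minimal define-by-run sketch of the kind of nested search space optuna allows (the parameter names are generic placeholders, not taken from this repository):

import optuna

def objective(trial):
    # The choice of kernel opens up different sub-parameters.
    kernel = trial.suggest_categorical('kernel', ['linear', 'rbf'])
    if kernel == 'rbf':
        # `gamma` only exists in the search space when the rbf kernel is selected.
        gamma = trial.suggest_float('gamma', 1e-4, 1e1, log=True)
    C = trial.suggest_float('C', 1e-3, 1e3, log=True)
    # Placeholder: a real objective would fit a model with these parameters
    # and return a validation score.
    return 0.0

study = optuna.create_study(direction='minimize')
study.optimize(objective, n_trials=50)

A flat skopt space can't express that `gamma` depends on the value of `kernel`.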

Problems

Closes #5

Several test functions and ~five~ seven datasets from the UCI Machine Learning Repository were added and are exposed similarly to the built-in scikit-learn datasets.
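
A rough sketch of how these loaders are meant to be used, mirroring scikit-learn's dataset API (the module path and function name below are hypothetical and only illustrate the pattern):

from sklearn.model_selection import train_test_split

# Hypothetical import; the actual loaders are named after the respective UCI datasets.
from problems.datasets import load_concrete_strength

# Assumed to return the feature matrix X and target vector y.
X, y = load_concrete_strength()
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)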

heidmic commented 2 years ago

While reviewing the code, I found no issues or things I would want changed. However, I have not gotten around to trying it out yet; maybe some more points will arise then. What would be great is a usage guide and, more importantly, an extension guide (although I realize the latter can be quite complex). I think the examples already help with the former, but as soon as we want to change some details, it can get hard to assess the interdependencies.

More out of interest than critique: Is there a particular advantage to Optuna? What does it do that skopt does not? What does it do better? Just interested in your thought process. If I saw it correctly, it has CMA-ES sampling, which is always nice.

Saethox commented 2 years ago

Is there a particular advantage with Optuna? What does it do that skopt does not? What does it do better?

Optuna supports nested search spaces, i.e., tuning parameters that only arise if some other parameter is set to a certain value. I figured that evaluating all 14 individual_optimizer versions, like I did in my thesis, would be too much. So I'm also letting the tuner decide which subversion of a metaheuristic to use, e.g., Bitwise or DimensionFlips ABCA. But these components can have different parameters, and that's what skopt doesn't support.
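
As a rough illustration of that setup in optuna's define-by-run style (the variant and parameter names below are made up for the example, not the actual suprb2 components):

def suggest_individual_optimizer(trial):
    # The tuner first picks the subversion of the metaheuristic ...
    variant = trial.suggest_categorical('abca_variant', ['Bitwise', 'DimensionFlips'])
    # ... and only then samples the parameters that exist for that subversion.
    if variant == 'Bitwise':
        params = {'flip_probability': trial.suggest_float('flip_probability', 0.01, 0.5)}
    else:
        params = {'n_flips': trial.suggest_int('n_flips', 1, 10)}
    return variant, params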

RomanSraj commented 2 years ago

How do you execute any of the runs (e.g. runs/minimal.py) locally? If I try to, it gets stuck at "/home/gl-455/suprb2/suprb2/individual/mixing_model.py", line 14, in ErrorExperienceHeuristic. The same happens for the other ones in the runs folder.

Saethox commented 2 years ago

How do you execute any of the runs (e.g. runs/minimal.py) locally? If I try to, it gets stuck at "/home/gl-455/suprb2/suprb2/individual/mixing_model.py", line 14, in ErrorExperienceHeuristic. The same happens for the other ones in the runs folder.

Can you give the exact error message? Is your installation of suprb2 and suprb2opt up to date? Because we don't use versioning, you have to manually install new versions from master using:

pip install --upgrade -e git+https://github.com/heidmic/suprb2@master#egg=suprb2

Can you execute the examples in the original suprb2 repo?

Saethox commented 2 years ago

Did you try it again yet, @RomanSraj, with the updated version (https://github.com/heidmic/suprb2-experimentation/pull/6#issuecomment-1012085944)?

RomanSraj commented 2 years ago

Did you try it again yet, @RomanSraj, with the updated version (https://github.com/heidmic/suprb2-experimentation/pull/6#issuecomment-1012085944)?

Yes, sorry, I totally forgot to update. I was missing access to suprb2opt. It works fine now.