Plug-and-play hydra sweepers for the EA-based multifidelity method DEHB and several population-based training variations, all proven to efficiently tune RL hyperparameters.
Apache License 2.0
Allow Seeding the ConfigSpace and Optimization Algorithm (Improvement) #19
Hi!
For reproducibility, it would be good to allow seeding of the config space used by the DEHB or PBT sweeper. I believe this was originally intended to be supported, since the function https://github.com/facebookresearch/how-to-autorl/blob/da9ecf831f3357d54efaf9f06e04cc34d0b5a9f4/hydra_plugins/utils/search_space_encoding.py#L35 accepts a seed. However, in e.g. https://github.com/facebookresearch/how-to-autorl/blob/da9ecf831f3357d54efaf9f06e04cc34d0b5a9f4/hydra_plugins/hydra_dehb_sweeper/dehb_sweeper_backend.py#L141 a seed is never passed.
Additionally, I added seeding of the numpy random generator; for DEHB this is necessary to seed the differential evolution backend.
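To illustrate the point (this is a minimal sketch, not the plugin's actual code): if the same seed is threaded into whatever draws the hyperparameter samples, two runs produce identical configurations. Here a seeded numpy `Generator` stands in for the ConfigSpace sampling and the differential evolution backend; the function name and parameter ranges are made up for the example.

```python
import numpy as np

def sample_hyperparameters(seed):
    # Hypothetical stand-in: in the sweeper, this seed would be passed
    # through to the ConfigSpace (via the search_space_encoding helper)
    # and to numpy, which the DE backend draws from.
    rng = np.random.default_rng(seed)
    return {
        "lr": rng.uniform(1e-4, 1e-1),
        "gamma": rng.uniform(0.9, 0.999),
    }

# Same seed -> identical hyperparameter draws, which is what makes
# a sweep reproducible end to end.
assert sample_hyperparameters(0) == sample_hyperparameters(0)
assert sample_hyperparameters(0) != sample_hyperparameters(1)
```

Without the seed being forwarded, each run silently falls back to fresh entropy, so results cannot be reproduced even when every other config value is fixed.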
See PR #21