sherpa-ai / sherpa

Hyperparameter optimization that enables researchers to experiment, visualize, and scale quickly.
http://parameter-sherpa.readthedocs.io/
GNU General Public License v3.0

Support for new algorithms #29

Closed: anirudhacharya closed this issue 5 years ago

anirudhacharya commented 5 years ago

LarsHH commented 5 years ago

Asynchronous Successive Halving (ASHA) is implemented ( https://github.com/sherpa-ai/sherpa/blob/8d6029307d27a2cef953c0ca36f197f14d321815/sherpa/algorithms/successive_halving.py#L27 ), which according to https://arxiv.org/pdf/1810.05934.pdf outperforms Hyperband. ASHA can actually be used to emulate Hyperband by iterating over different bracket settings, but I'll leave that to future work for now.
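For readers unfamiliar with ASHA, the core idea is a promotion rule: whenever a worker frees up, scan the rungs from the top down and promote any configuration that sits in the top 1/eta of its rung and has not been promoted yet; otherwise start a fresh configuration at the bottom rung. A minimal sketch of that rule follows (the `get_job` helper and the dict-based rung bookkeeping are hypothetical illustrations, not Sherpa's actual API):

```python
def top_k(rung, k):
    """Best k configs in a rung by validation loss (lower is better)."""
    return sorted(rung, key=lambda c: c["loss"])[:k]

def get_job(rungs, eta=3):
    """ASHA promotion rule (hypothetical helper, not Sherpa's API).

    `rungs` is a list of lists; rungs[k] holds dicts like
    {"config": ..., "loss": ..., "promoted": False} for configurations
    that finished at resource level k. Scan from the highest promotable
    rung down: if some config is in the top 1/eta of its rung and has
    not been promoted yet, promote it one rung up. Otherwise return
    None so the caller samples a fresh config at rung 0.
    """
    for k in reversed(range(len(rungs) - 1)):
        rung = rungs[k]
        candidates = top_k(rung, len(rung) // eta)
        promotable = [c for c in candidates if not c["promoted"]]
        if promotable:
            promotable[0]["promoted"] = True
            return promotable[0]["config"], k + 1  # run at rung k + 1
    return None  # nothing promotable: start a new config at rung 0
```

Because the rule never waits for a rung to fill up before promoting, free workers are always kept busy; that is what makes the algorithm asynchronous rather than synchronous successive halving.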

LarsHH commented 5 years ago

Looks like https://github.com/DEAP could be a good starting point for the particle swarm optimization implementation.
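For context, DEAP's documented PSO example builds particles out of its creator/toolbox machinery. A condensed sketch along those lines (the bounds, swarm size, and sphere benchmark are arbitrary illustration choices, not anything Sherpa prescribes) could look like:

```python
import math
import operator
import random

from deap import base, benchmarks, creator, tools

# Minimizing, so the single fitness weight is negative.
creator.create("FitnessMin", base.Fitness, weights=(-1.0,))
creator.create("Particle", list, fitness=creator.FitnessMin,
               speed=list, smin=None, smax=None, best=None)

def generate(size, pmin, pmax, smin, smax):
    # Random position and velocity within the given bounds.
    part = creator.Particle(random.uniform(pmin, pmax) for _ in range(size))
    part.speed = [random.uniform(smin, smax) for _ in range(size)]
    part.smin, part.smax = smin, smax
    return part

def update_particle(part, best, phi1, phi2):
    # Standard PSO velocity update: pull toward the particle's own best
    # and the swarm's global best, then clamp the speed magnitude.
    u1 = (random.uniform(0, phi1) for _ in range(len(part)))
    u2 = (random.uniform(0, phi2) for _ in range(len(part)))
    v_u1 = map(operator.mul, u1, map(operator.sub, part.best, part))
    v_u2 = map(operator.mul, u2, map(operator.sub, best, part))
    part.speed = list(map(operator.add, part.speed,
                          map(operator.add, v_u1, v_u2)))
    for i, s in enumerate(part.speed):
        if abs(s) > part.smax:
            part.speed[i] = math.copysign(part.smax, s)
    part[:] = list(map(operator.add, part, part.speed))

toolbox = base.Toolbox()
toolbox.register("particle", generate, size=2,
                 pmin=-6, pmax=6, smin=-3, smax=3)
toolbox.register("population", tools.initRepeat, list, toolbox.particle)
toolbox.register("update", update_particle, phi1=2.0, phi2=2.0)
toolbox.register("evaluate", benchmarks.sphere)

pop = toolbox.population(n=20)
best = None
for _ in range(100):
    for part in pop:
        part.fitness.values = toolbox.evaluate(part)
        # Fitness comparison respects the negative weight, so
        # "greater" fitness means a lower sphere value here.
        if part.best is None or part.best.fitness < part.fitness:
            part.best = creator.Particle(part)
            part.best.fitness.values = part.fitness.values
        if best is None or best.fitness < part.fitness:
            best = creator.Particle(part)
            best.fitness.values = part.fitness.values
    for part in pop:
        toolbox.update(part, best)

print(best, best.fitness.values)
```

The missing piece for Sherpa would be driving the evaluate step from trial results reported back by workers rather than from a benchmark function, but that mapping is left open in this thread.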

LarsHH commented 5 years ago

Converted to cards in projects: https://github.com/sherpa-ai/sherpa/projects/1#card-28466175 https://github.com/sherpa-ai/sherpa/projects/1#card-28466208