harbecke / HexHex

AlphaGo Zero adaptation for Hex
GNU General Public License v3.0

add Bayesian Optimization for hyperparameter search #26

Closed harbecke closed 5 years ago

harbecke commented 5 years ago

We still have little idea how close to optimal our parameters are. I am working on adding Ax. Here is a list of candidate parameters for optimization:

[TRAIN] batch_size, learning_rate, epochs, weight_decay

[CREATE_DATA] train_samples_per_model, temperature, temperature_decay, gamma

[REPEATED SELF TRAINING] num_data_models
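As a placeholder until Ax is wired in, the search space for the parameters above can be declared in Ax's parameter-dict format and sampled with a stdlib random sampler (standing in for the Bayesian loop). All bounds below, and the `dummy_elo` objective, are illustrative assumptions, not values from this repo:

```python
import math
import random

# Ax-style parameter declarations for a subset of the knobs listed above;
# every bound here is an assumption for illustration only.
SEARCH_SPACE = [
    {"name": "batch_size", "type": "range", "bounds": [32, 512]},
    {"name": "learning_rate", "type": "range", "bounds": [1e-5, 1e-1], "log_scale": True},
    {"name": "epochs", "type": "range", "bounds": [1, 20]},
    {"name": "weight_decay", "type": "range", "bounds": [1e-6, 1e-2], "log_scale": True},
    {"name": "temperature", "type": "range", "bounds": [0.5, 2.0]},
    {"name": "gamma", "type": "range", "bounds": [0.9, 1.0]},
]

def sample_config(space, rng):
    """Draw one configuration; log-scale ranges are sampled in log space."""
    cfg = {}
    for p in space:
        lo, hi = p["bounds"]
        if p.get("log_scale"):
            cfg[p["name"]] = math.exp(rng.uniform(math.log(lo), math.log(hi)))
        elif isinstance(lo, int) and isinstance(hi, int):
            cfg[p["name"]] = rng.randint(lo, hi)
        else:
            cfg[p["name"]] = rng.uniform(lo, hi)
    return cfg

def dummy_elo(cfg):
    # Hypothetical objective: in the real setup this would train a model and
    # return its Elo estimate against reference models.
    return -abs(cfg["learning_rate"] - 1e-3)

rng = random.Random(0)
trials = [sample_config(SEARCH_SPACE, rng) for _ in range(10)]
best = max(trials, key=dummy_elo)
```

Ax's service API (`ax.service.managed_loop.optimize`) accepts parameter dicts in exactly this shape, so the declaration can be reused once the real evaluation function exists.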

harbecke commented 5 years ago
  1. define a more solid measurement (e.g. Elo rating against reference_models) ✔️
  2. define a better stopping criterion (wall-clock time instead of RST iterations) ✔️
  3. get more interpretable results (maybe change framework) ✔️
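The Elo measurement in item 1 reduces to the standard expected-score update. A minimal sketch (the K factor of 32 and the ratings are illustrative, not the repo's values):

```python
def elo_expected(r_a, r_b):
    """Expected score of player A against player B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))

def elo_update(r_a, r_b, score_a, k=32):
    """Return updated ratings after one game; score_a is 1, 0.5, or 0."""
    e_a = elo_expected(r_a, r_b)
    return (r_a + k * (score_a - e_a),
            r_b + k * ((1 - score_a) - (1 - e_a)))

# Example: an even matchup where A wins.
a, b = elo_update(1500, 1500, 1.0)  # → (1516.0, 1484.0)
```

Rating each checkpoint against fixed reference models this way gives a score that is comparable across RST iterations, unlike raw win rates against a moving opponent.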
harbecke commented 5 years ago

TODO:

harbecke commented 5 years ago

added a visualization notebook and an intermediate-saving callback; saving the best model is difficult and unlikely to be highly beneficial
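The intermediate-saving callback could be as simple as the following sketch (class name, file layout, and `pickle` serialization are assumptions; the actual repo may save framework-specific checkpoints instead):

```python
import pathlib
import pickle

class IntermediateSaver:
    """Hypothetical callback: persist the model state every `interval` iterations."""

    def __init__(self, out_dir, interval=5):
        self.out_dir = pathlib.Path(out_dir)
        self.out_dir.mkdir(parents=True, exist_ok=True)
        self.interval = interval

    def __call__(self, iteration, model_state):
        # Save on multiples of the interval; return the path written, else None.
        if iteration % self.interval == 0:
            path = self.out_dir / f"model_{iteration:04d}.pkl"
            with path.open("wb") as f:
                pickle.dump(model_state, f)
            return path
        return None
```

Saving every checkpoint and ranking them afterwards (e.g. by Elo) sidesteps the "best model" selection problem mentioned above.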