tomvannoord63 / RL

Repository of Reinforcement Learning Projects and Notebooks

Automatic Hyperparameter Optimization #2

Open tomvannoord63 opened 1 year ago

tomvannoord63 commented 1 year ago

Implement automatic hyperparameter tuning (probably using the Optuna optimization framework). This is a core feature of the project that has yet to be implemented.

tomvannoord63 commented 1 year ago

Created a new branch, optuna, to work on integrating Optuna into the project.

tomvannoord63 commented 1 year ago

TODO: Add a TensorBoard callback for the Optuna optimization, and add SQL storage for the results.

Potentially add a custom callback to implement pruning?

tomvannoord63 commented 1 year ago

TODO: Add Optuna's importance analysis to determine the most influential hyperparameters for the given implementation.

tomvannoord63 commented 1 year ago

Initial hyperparameter optimization is implemented on branch main/optuna. The current method creates a database that stores all completed trials, so an interrupted optimization can be resumed. At the moment the optimizer only tunes three hyperparameters: n_epochs, gamma, and total_timesteps. These are hardcoded and cannot be changed without editing the code. The end goal is to make them configurable without code changes, perhaps via a file that is read in at startup.
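One way to remove the hardcoding described above is a small JSON file defining the search space, read at startup. The file contents, schema, and helper name here are assumptions for illustration, not the project's actual format; only the standard library is needed to parse it.

```python
import json

# Example contents of an assumed hyperparams.json, covering the three
# currently hardcoded parameters.
config_text = """
{
  "n_epochs":        {"type": "int",   "low": 3,     "high": 30},
  "gamma":           {"type": "float", "low": 0.90,  "high": 0.9999},
  "total_timesteps": {"type": "int",   "low": 10000, "high": 200000}
}
"""

def suggest_from_config(trial, config):
    """Sample each hyperparameter according to its entry in the config dict."""
    params = {}
    for name, spec in config.items():
        if spec["type"] == "int":
            params[name] = trial.suggest_int(name, spec["low"], spec["high"])
        else:
            params[name] = trial.suggest_float(name, spec["low"], spec["high"])
    return params

# In the project this would be json.load(open("hyperparams.json")).
config = json.loads(config_text)
print(sorted(config))
```

The Optuna objective would then call `suggest_from_config(trial, config)` instead of hardcoded `suggest_*` calls, so changing the search space means editing only the JSON file.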