Open tomvannoord63 opened 1 year ago
Created a new branch, optuna, to work on integrating Optuna into the project.
TODO: Add a TensorBoard callback for the Optuna optimization, and add SQL storage for the results.
Potentially add a custom callback to implement pruning?
TODO: Use Optuna's importance evaluation to determine the most influential hyperparameters for this implementation.
Initial hyperparameter optimization is implemented on branch main/optuna. The current setup creates a database containing all saved trials, so an interrupted optimization can be resumed. At the moment the optimizer only tunes three hyperparameters: n_epochs, gamma, and total_timesteps. These are hardcoded and cannot be changed without editing the source. The end goal is to let users change them without touching the code, perhaps via a configuration file that is read in at startup.
Implement automatic hyperparameter tuning (probably using the Optuna optimization framework). This is a core feature of the project that has yet to be implemented.