thebes2 / RL


Add persistent episode #14

Open thebes2 opened 2 years ago

thebes2 commented 2 years ago

Store the epoch number in the checkpoints. Restoring a checkpoint currently restarts training at epoch 0, which will not work with lr schedulers that depend on the current epoch.
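
A minimal sketch of what this could look like, assuming a PyTorch-style setup (the `model`, `optimizer`, `scheduler` names, checkpoint keys, and path are illustrative, not the repo's actual code):

```python
import torch

def save_checkpoint(path, model, optimizer, scheduler, epoch):
    # Persist the epoch alongside the model/optimizer/scheduler state
    torch.save({
        "epoch": epoch,
        "model_state": model.state_dict(),
        "optimizer_state": optimizer.state_dict(),
        "scheduler_state": scheduler.state_dict(),
    }, path)

def load_checkpoint(path, model, optimizer, scheduler):
    ckpt = torch.load(path)
    model.load_state_dict(ckpt["model_state"])
    optimizer.load_state_dict(ckpt["optimizer_state"])
    scheduler.load_state_dict(ckpt["scheduler_state"])
    # Resume from the saved epoch instead of restarting at 0
    return ckpt["epoch"]

# Training then resumes at the restored epoch, so the lr schedule is not reset:
# start_epoch = load_checkpoint("ckpt.pt", model, optimizer, scheduler)
# for epoch in range(start_epoch, num_epochs):
#     ...
```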

thebes2 commented 2 years ago

We can replace the existing flag by simply checking whether the current epoch is 0.
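
For illustration, a hedged sketch of that check (the `start_epoch` and `initialize_weights` names are hypothetical placeholders, not identifiers from this repo):

```python
def is_fresh_run(start_epoch: int) -> bool:
    # Epoch 0 doubles as the "no checkpoint restored" indicator,
    # so no separate flag is needed.
    return start_epoch == 0

# Example usage:
# if is_fresh_run(start_epoch):
#     initialize_weights(model)  # hypothetical first-run setup
```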