YanCote / IFT6268-simclr

Project for IFT6268

Implement model checkpointing in Finetuning loop. #7

Open marued opened 4 years ago

marued commented 4 years ago

Look at the current code base for existing checkpoint support.

marued commented 3 years ago

A first version has been added to the master branch for the finetuning session. Unfortunately, the commit is not linked to this ticket.

marued commented 3 years ago

It will save a checkpoint every 'epoch_save_step' iterations. To load a checkpoint, simply add the 'load_ckpt' variable to the finetuning yml.
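For reference, a finetuning config might look like the fragment below. Only the key names 'epoch_save_step' and 'load_ckpt' come from the comment above; the values and the checkpoint path are illustrative placeholders, not the project's actual config.

```yaml
# Illustrative finetuning yml fragment (values are placeholders).
epoch_save_step: 5                   # save a checkpoint every 5 iterations
load_ckpt: checkpoints/ckpt-latest   # resume finetuning from this checkpoint
```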

sgaut023 commented 3 years ago

I'm working on loading the saved checkpoints and using them for fine-tuning.

sgaut023 commented 3 years ago

@marued I added a new function in run.py to create the hub module from a checkpoint.

In theory, we will never need to call this function, because the hub module is created at the end of training and evaluation anyway.

However, it is useful to have this function when we want to create the hub module from a specific checkpoint.
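The idea of "build a reusable module from a specific checkpoint" can be sketched as follows. The real code in run.py presumably works with TensorFlow checkpoints and a TF Hub export; this is only a minimal stdlib illustration of the pattern, and every name in it (`save_checkpoint`, `module_from_checkpoint`, the `"weights"` key) is hypothetical, not taken from the repository.

```python
import pickle
from pathlib import Path

def save_checkpoint(state, ckpt_dir, step):
    """Serialize training state to disk; in the finetuning loop this
    would be called every `epoch_save_step` iterations."""
    ckpt_dir = Path(ckpt_dir)
    ckpt_dir.mkdir(parents=True, exist_ok=True)
    path = ckpt_dir / f"ckpt-{step}.pkl"
    with open(path, "wb") as f:
        pickle.dump(state, f)
    return path

def module_from_checkpoint(ckpt_path):
    """Restore a specific checkpoint and wrap it as a callable module,
    mirroring the helper described above (names hypothetical)."""
    with open(ckpt_path, "rb") as f:
        state = pickle.load(f)
    weights = state["weights"]
    # The exported "module" simply applies the restored weights.
    def module(x):
        return [w * x for w in weights]
    return module
```

This keeps the export path independent of the training loop: even though the module is normally produced at the end of training and eval, any saved checkpoint can be turned into a module after the fact.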