Closed: rzumer closed this issue 5 years ago
If you just want to do fine-tuning or transfer learning, simply switching to another dataset is fine. You can use --pth to load a specific weight file. adam.pth holds the Adam optimizer's state (momentum, etc.); deleting it will reset the optimizer.
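
For reference, here is a minimal PyTorch sketch of what this amounts to; the model, file names, and learning rate are hypothetical placeholders, not this repo's actual training code:

```python
import os
import torch
import torch.nn as nn

# Hypothetical model; substitute the repo's actual network class.
model = nn.Sequential(nn.Linear(10, 10))

# Load a specific weight file (what --pth points at).
model.load_state_dict(torch.load("epoch_100.pth", map_location="cpu"))

# A fresh Adam instance has no accumulated state (momentum, etc.),
# which is what deleting adam.pth effectively gives you.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# If adam.pth still exists, loading it restores the optimizer state instead.
if os.path.exists("adam.pth"):
    optimizer.load_state_dict(torch.load("adam.pth", map_location="cpu"))
```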
I would like to know if there is a simple way to retrain part of a network. For example, given a network trained for 200 epochs on some dataset, start from epoch 100 and train for another 100 epochs using a different dataset, different learning rate, etc. (overwriting checkpoints is OK).
Can this be achieved simply by deleting the .pth files above the epoch from which I want to retrain? If so, does the adam.pth file have to be modified? Or is there another solution?
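
For concreteness, something like the following is what I mean by deleting checkpoints above an epoch (assuming files are named like epoch_100.pth; the naming scheme here is just a guess):

```python
from pathlib import Path

resume_from = 100  # keep epochs 1..100 and retrain from there

# Remove checkpoint files for epochs after the resume point,
# assuming a hypothetical epoch_<N>.pth naming scheme.
for ckpt in Path("checkpoints").glob("epoch_*.pth"):
    epoch = int(ckpt.stem.split("_")[1])
    if epoch > resume_from:
        ckpt.unlink()
```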
Thanks.