LoSealL / VideoSuperResolution

A collection of state-of-the-art video and single-image super-resolution architectures, reimplemented in TensorFlow.
MIT License
1.62k stars 296 forks

Partial retraining #85

Closed rzumer closed 5 years ago

rzumer commented 5 years ago

I would like to know if there is a simple way to retrain part of a network. For example, given a network trained for 200 epochs on some dataset, start from epoch 100 and train for another 100 epochs using a different dataset, different learning rate, etc. (overwriting checkpoints is OK).

Can this be achieved simply by deleting the .pth files above the epoch from which I want to retrain? If so, does the adam.pth file have to be modified? Or is there another solution?

Thanks.

LoSealL commented 5 years ago

If you just want fine-tuning or transfer learning, simply switching to another dataset is fine. You can use --pth to load a specific weight file. adam.pth holds the Adam optimizer's state (momentum, etc.); deleting it will reset the optimizer.
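A minimal PyTorch sketch of what this amounts to, using a hypothetical stand-in model rather than this repo's actual training loop: load a specific epoch's weight file (as `--pth` would), then construct a fresh Adam optimizer instead of restoring adam.pth, which has the same effect as deleting that file — the momentum/variance buffers start empty.

```python
import io
import torch
import torch.nn as nn

# Stand-in for a real super-resolution network (hypothetical, for illustration).
model = nn.Linear(4, 4)

# Save and reload a specific weight file, mimicking loading an epoch-100
# checkpoint via --pth. An in-memory buffer is used here instead of a path.
buf = io.BytesIO()
torch.save(model.state_dict(), buf)
buf.seek(0)
model.load_state_dict(torch.load(buf))

# Creating a new Adam optimizer (i.e. not restoring adam.pth) resets its
# state: no momentum or variance buffers exist yet, so training resumes
# from the loaded weights with a fresh optimizer and a new learning rate.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
print(len(optimizer.state))  # 0 -> optimizer state is fresh
```

The first call to `optimizer.step()` after this will populate the Adam buffers from scratch, exactly as if adam.pth had been deleted before resuming.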