Closed: DZY-cybe closed this issue 2 years ago
It shouldn't. Are you trying to finetune a model?
Yes. I also want to restart a stopped training run.
I think this is simply due to the pretrained model being in a folder with a different name. If needed, you can bypass PyTorch Lightning's checkpointing mechanism and simply load the model with a load_state_dict call.
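A minimal sketch of that manual loading path, assuming a plain PyTorch module (TinyNet here is a placeholder for your actual model class, and "pretrained.ckpt" stands in for your checkpoint file). A Lightning checkpoint is an ordinary torch.save file whose weights sit under the "state_dict" key:

```python
import torch
import torch.nn as nn

# Placeholder for the real model class; swap in your LightningModule.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

# Simulate a saved Lightning checkpoint: the module weights are
# stored under the "state_dict" key of the checkpoint dict.
src = TinyNet()
torch.save({"state_dict": src.state_dict()}, "pretrained.ckpt")

# Bypass Lightning's checkpoint restore: load the file directly
# and copy the weights into a fresh model with load_state_dict.
ckpt = torch.load("pretrained.ckpt", map_location="cpu")
dst = TinyNet()
dst.load_state_dict(ckpt["state_dict"])
```

Depending on how the LightningModule wraps the network, the keys in "state_dict" may carry an attribute prefix (e.g. "model."); strip it before loading if the names don't match.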
Thanks for your code! I have a question about restarting a training run. According to the README.md, I should be able to use: python train.py --id train_ID --path_data path/to/waymo/training/dir --gpus 0. But when I start from the pretrained model, it creates a new version instead of resuming. Is that expected?