Closed ad5014858 closed 3 months ago
Hi, the checkpoint size should stay pretty much the same over multiple epochs, since the same number of weights is stored in each checkpoint. The size should only change if you add or remove layers from your model during training. The values stored inside the weights, however, should change. You can verify this by loading two checkpoints and comparing the state_dict in each.
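A minimal sketch of that comparison (the checkpoint filenames are illustrative, and the in-place weight update is just a stand-in for one epoch of real training):

```python
import torch
import torch.nn as nn

def changed_params(ckpt_path_a, ckpt_path_b):
    """Return names of parameters whose values differ between two checkpoints."""
    sd_a = torch.load(ckpt_path_a, map_location="cpu")["state_dict"]
    sd_b = torch.load(ckpt_path_b, map_location="cpu")["state_dict"]
    return [k for k in sd_a if not torch.equal(sd_a[k], sd_b[k])]

# Illustrative setup: fake two epochs by saving a tiny model before and
# after a manual weight update (a real run would use ModelCheckpoint).
model = nn.Linear(4, 2)
torch.save({"state_dict": model.state_dict()}, "epoch0.ckpt")
with torch.no_grad():
    model.weight += 0.1  # stand-in for one epoch of training
torch.save({"state_dict": model.state_dict()}, "epoch1.ckpt")

print(changed_params("epoch0.ckpt", "epoch1.ckpt"))  # → ['weight']
```

Both files end up the same size on disk, yet the weight values inside differ — which is exactly the behavior described above.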
Thank you very much for your reply, it cleared up my confusion.
Dear all, I am a beginner. When training with a custom dataset and using the ModelCheckpoint callback to save a .ckpt after each epoch, I found that the .ckpt file size is the same for every epoch.
The code is as follows:
I've searched in many places and can't find the reason. Please tell me why this happens, thanks.