vdobrovolskii / wl-coref

This repository contains the code for the EMNLP 2021 paper "Word-Level Coreference Resolution"
MIT License

continue training from checkpoint? #35

Closed Zoeyyao27 closed 2 years ago

Zoeyyao27 commented 2 years ago

Is there a way to continue training from the last saved model?

vdobrovolskii commented 2 years ago

To actually continue, you would need the saved optimizer states, which were removed to make the weights file smaller. But it should be possible to initialize the model with the pretrained weights and then start training.
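A minimal sketch of what that could look like in generic PyTorch (the `build_model` helper, checkpoint path, and learning rate here are placeholders, not this repo's actual API):

```python
import torch
import torch.nn as nn

def build_model() -> nn.Module:
    # Stand-in for constructing the coreference model.
    return nn.Linear(768, 2)

checkpoint = torch.load("data/release_model.pt", map_location="cpu")

model = build_model()
# Load only the weight tensors; strict=False tolerates any extra or
# missing keys left over after the optimizer entries were stripped.
model.load_state_dict(checkpoint, strict=False)

# The optimizer starts fresh, since its state was not released.
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)
```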

Zoeyyao27 commented 2 years ago

> To actually continue, you would need the saved optimizer states, which were removed to make the weights file smaller. But it should be possible to initialize the model with the pretrained weights and then start training.

Thank you! I actually managed to continue from the saved checkpoint. But judging by your code in https://github.com/vdobrovolskii/wl-coref/blob/96b3d5e50367b253cafa81ea82ae39e44ca7fa46/coref/coref_model.py#L266, I assume the optimizer states have already been saved?
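For reference, the usual PyTorch pattern for a resumable checkpoint looks roughly like this (the key names are illustrative; the ones in wl-coref may differ):

```python
import torch
import torch.nn as nn

model = nn.Linear(768, 2)  # stand-in for the coreference model
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)

# Saving both state dicts lets training resume exactly: the optimizer's
# momentum buffers and step counts live in its state dict.
torch.save(
    {"model": model.state_dict(), "optimizer": optimizer.state_dict()},
    "checkpoint.pt",
)

# Resuming restores both halves.
ckpt = torch.load("checkpoint.pt", map_location="cpu")
model.load_state_dict(ckpt["model"])
optimizer.load_state_dict(ckpt["optimizer"])
```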

vdobrovolskii commented 2 years ago

They were originally, but then they were stripped from the state dict by a separate script to save space.
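A minimal sketch of what such a stripping script could look like (the key naming used to filter entries is an assumption, not the repo's actual script):

```python
import torch

full = torch.load("checkpoint_full.pt", map_location="cpu")
# Keep only model weights; drop optimizer and scheduler entries.
slim = {key: value for key, value in full.items()
        if "optim" not in key and "scheduler" not in key}  # assumed naming
torch.save(slim, "checkpoint_release.pt")  # much smaller file
```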

Zoeyyao27 commented 2 years ago

Ha! I see. Thank you for your kind reply! Great work by the way! :)