nlpyang / BertSum

Code for the paper "Fine-tune BERT for Extractive Summarization"
Apache License 2.0

How to continue training from previous checkpoints? #118

Closed · namln2k closed 2 years ago

namln2k commented 2 years ago

Hi guys, I'm new to NLP and to summarization in particular. I'm running the BERT+RNN model on Google Colab. I understand that the parameter `-save_checkpoint_steps 1000` in the training script saves the model every 1000 training steps. However, I don't know how to continue from such a checkpoint; all I know is to run the training script (`python train.py ...`), and each time it starts from the very beginning. I have mounted my Google Drive to the notebook, and the checkpoints are saved as shown in the screenshots below. Can someone tell me how to do this? Is there a parameter I should add to the script to make it work? Many thanks in advance!

[screenshots of the saved checkpoint files]

Anothernewcomer commented 2 years ago

Use the parameter `args.train_from`.
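
For anyone landing here later, a minimal sketch of the resume command. The `-train_from` flag and the `model_step_<N>.pt` checkpoint naming come from BertSum itself; the data and model paths below are placeholders for your own setup.

```bash
# Resume training instead of starting from scratch:
# point -train_from at one of the checkpoints that
# -save_checkpoint_steps wrote into -model_path.
python train.py -mode train -encoder rnn \
    -bert_data_path ../bert_data/cnndm \
    -model_path ../models/bert_rnn \
    -save_checkpoint_steps 1000 \
    -train_from ../models/bert_rnn/model_step_1000.pt
```

With `-train_from` set, training should restore the saved model (and its optimizer state) from that file and pick up where it left off rather than reinitializing.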

namln2k commented 2 years ago

> Use the parameter `args.train_from`.

Yeah, I just found this out this afternoon :v Thanks anyway!

Anothernewcomer commented 2 years ago

> > Use the parameter `args.train_from`.
>
> Yeah, I just found this out this afternoon :v Thanks anyway!

I want to use this model, but I'm new to this and I'm not sure whether my loss trend is correct. Can we talk?

namln2k commented 2 years ago

> > > Use the parameter `args.train_from`.
> >
> > Yeah, I just found this out this afternoon :v Thanks anyway!
>
> I want to use this model, but I'm new to this and I'm not sure whether my loss trend is correct. Can we talk?

I'm a newbie too, man. I'm just trying to finish a short course at my school, and right now I'm not that interested in this field.