nlpyang / PreSumm

code for EMNLP 2019 paper Text Summarization with Pretrained Encoders
MIT License

Pretrained Models #187

Open matt9704 opened 4 years ago

matt9704 commented 4 years ago

[screenshot: list of the provided pretrained models]

Only these pretrained models are provided. What if I want to use BertAbs? I am running:

```
python train.py -task abs -mode train -bert_data_path BERT_DATA_PATH -dec_dropout 0.2 -model_path MODEL_PATH -sep_optim true -lr_bert 0.002 -lr_dec 0.2 -save_checkpoint_steps 2000 -batch_size 140 -train_steps 200000 -report_every 50 -accum_count 5 -use_bert_emb true -use_interval true -warmup_steps_bert 20000 -warmup_steps_dec 10000 -max_pos 512 -visible_gpus 0,1,2,3 -log_file ../logs/abs_bert_cnndm
```

What should MODEL_PATH contain?

SebastianVeile commented 4 years ago

MODEL_PATH is the directory where training checkpoints are saved, so for a fresh training run it can simply be an empty directory. With `-save_checkpoint_steps 2000`, a checkpoint is written there every 2,000 steps.
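For illustration, here is a minimal sketch of inspecting one such checkpoint. It assumes PreSumm's saver names checkpoints `model_step_<N>.pt` under the directory passed via `-model_path`; the filename and the stored keys are assumptions, not something I have verified against the code.

```python
import torch

# Hypothetical checkpoint written after the first save interval;
# the model_step_<N>.pt naming is an assumption about PreSumm's saver.
ckpt_path = "MODEL_PATH/model_step_2000.pt"

# PyTorch checkpoints are pickled dicts, so we can list what the
# trainer stored (model weights, optimizer state, options, ...).
checkpoint = torch.load(ckpt_path, map_location="cpu")
print(list(checkpoint.keys()))
```

If I remember the flags correctly, once training finishes you pass one of these checkpoint files to `-test_from` when decoding/evaluating.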