RowitZou / topic-dialog-summ

AAAI-2021 paper: Topic-Oriented Spoken Dialogue Summarization for Customer Service with Saliency-Aware Topic Modeling.
MIT License

how to use args pretrain #28

Closed wac81 closed 2 years ago

wac81 commented 2 years ago

I found that the -pretrain option seems to have no effect during RL training. Or does RL training just continue fine-tuning the basic BERT model?

wac81 commented 2 years ago

Is that true?

RowitZou commented 2 years ago

Pretrain refers to the separate training process of the encoder and decoder on top of the basic BERT model.

wac81 commented 2 years ago
```python
# Pretrain args
parser.add_argument("-pretrain", type=str2bool, nargs='?', const=True, default=False)
```
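For context, the snippet above relies on a `str2bool` helper defined elsewhere in the repo. A minimal sketch of the usual argparse idiom (this is an assumption about how the repo's helper works, not a copy of it):

```python
import argparse

def str2bool(v):
    # Common argparse idiom: accept strings like "true"/"false" as booleans.
    # This is a hypothetical reconstruction of the repo's helper.
    if isinstance(v, bool):
        return v
    if v.lower() in ("yes", "true", "t", "y", "1"):
        return True
    if v.lower() in ("no", "false", "f", "n", "0"):
        return False
    raise argparse.ArgumentTypeError("Boolean value expected.")

parser = argparse.ArgumentParser()
parser.add_argument("-pretrain", type=str2bool, nargs="?", const=True, default=False)

args = parser.parse_args(["-pretrain", "true"])
print(args.pretrain)  # True
```

With `nargs='?'` and `const=True`, passing a bare `-pretrain` (no value) also sets the flag to `True`.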

My question is: if you set default=True while doing RL training, I found that the base BERT model gets trained too. Is that true?

RowitZou commented 2 years ago

If pretrain is set to True, it runs the separate training process, not RL. If pretrain is set to False, it runs RL training. In both settings, BERT is fine-tuned as well.
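The control flow the maintainer describes can be sketched as a simple branch on the flag. This is a toy illustration of the explanation above, not the repo's actual training code; the function names are hypothetical stand-ins:

```python
from types import SimpleNamespace

def run_separate_pretraining(model):
    # Hypothetical stand-in for the separate encoder/decoder training stage.
    return "pretrain"

def run_rl_training(model):
    # Hypothetical stand-in for the RL training stage.
    return "rl"

def train(args, model=None):
    # In BOTH branches the underlying BERT parameters remain trainable,
    # i.e. BERT is fine-tuned regardless of the flag.
    if args.pretrain:
        return run_separate_pretraining(model)
    return run_rl_training(model)

print(train(SimpleNamespace(pretrain=True)))   # pretrain
print(train(SimpleNamespace(pretrain=False)))  # rl
```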