Is that true?
Pretrain means the separate supervised training of the encoder and decoder, built on the base BERT model.
```python
# Pretrain args
parser.add_argument("-pretrain", type=str2bool, nargs='?', const=True, default=False)
```
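For context, `argparse` has no built-in boolean type, so projects usually define a small `str2bool` helper themselves. I haven't checked this repo's exact definition, but the common pattern looks like this:

```python
import argparse

def str2bool(v):
    # Typical helper (an assumption, not copied from this repo):
    # map common truthy/falsy strings to bool so "-pretrain true" works.
    if isinstance(v, bool):
        return v
    if v.lower() in ("yes", "true", "t", "y", "1"):
        return True
    if v.lower() in ("no", "false", "f", "n", "0"):
        return False
    raise argparse.ArgumentTypeError("Boolean value expected.")
```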
My question is: if you set default=True and run RL training, I found that the base BERT model gets trained too. Is that true?
If pretrain is set to True, it just runs the separate training stage, not RL; if pretrain is set to False, it runs RL training. In both settings, BERT is fine-tuned as well.
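A minimal sketch of what that implies (the module layout and loss functions here are placeholders I made up, not the repo's actual code): the flag only selects which training loop runs, while the optimizer holds the BERT encoder's parameters in both branches, so gradients update BERT either way.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the repo's real modules, for illustration only.
bert = nn.Linear(768, 768)    # placeholder for the BERT encoder
decoder = nn.Linear(768, 10)  # placeholder for the decoder

# BERT's parameters are registered with the optimizer in both branches,
# so it is fine-tuned whether pretrain is True or False.
optimizer = torch.optim.Adam(
    list(bert.parameters()) + list(decoder.parameters()), lr=1e-5)

def train_step(x, pretrain):
    logits = decoder(bert(x))
    if pretrain:
        # separate supervised training of encoder/decoder (dummy loss)
        loss = logits.pow(2).mean()
    else:
        # RL training: a policy-gradient-style surrogate loss (dummy);
        # gradients still flow back into the BERT parameters
        loss = -logits.log_softmax(-1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

train_step(torch.randn(4, 768), pretrain=False)  # BERT weights update here too
```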
So is the pretrain option simply ignored during RL training? Or does RL training continue fine-tuning the base BERT model?