Closed wac81 closed 2 years ago
models/rl_model.py, line 44:
self.encoder = Bert(args.bert_dir, args.finetune_bert)
You could replace BERT with other pre-trained encoders, e.g., RoBERTa. However, seq2seq architectures like BART are not supported yet, because we designed our own decoder.
Yes.
RuntimeError: Given normalized_shape=[768], expected input with shape [*, 768], but got input of size[4, 4, 1024]
I tried a RoBERTa model but got the error above. Could you suggest all the code or argument changes needed?
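The shapes in the error suggest the cause: the downstream layers (e.g., a LayerNorm) were built for BERT-base's 768-dim hidden states, but a roberta-large checkpoint outputs 1024-dim states. A minimal sketch of the mismatch and two possible fixes, using plain PyTorch (the variable names here are illustrative, not from the repo):

```python
import torch
from torch import nn

# LayerNorm sized for BERT-base's 768-dim hidden states.
bert_norm = nn.LayerNorm(768)

# roberta-large produces [batch, seq_len, 1024] hidden states.
roberta_large_out = torch.randn(4, 4, 1024)

try:
    bert_norm(roberta_large_out)
except RuntimeError as e:
    # Reproduces: "Given normalized_shape=[768], expected input with
    # shape [*, 768], but got input of size [4, 4, 1024]"
    print("mismatch:", e)

# Fix A: use a checkpoint whose hidden size is 768 (e.g., roberta-base).
# Fix B: rebuild every downstream layer with the new hidden size:
roberta_norm = nn.LayerNorm(1024)
out = roberta_norm(roberta_large_out)
print(out.shape)
```

In practice that means updating whatever argument the repo uses for the encoder hidden size so the decoder layers are constructed with 1024 instead of 768, or simply switching to roberta-base.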
It's done. I fixed all the args according to the hint.