Hi,

I have a few questions I'd like to ask:
In the reader, the pretrained weight you used for BertForQuestionAnswering is 'bert-base-uncased'. Is it okay to use a different pretrained weight, such as 'bert-large-uncased-whole-word-masking-finetuned-squad'? Would that improve the results?
When I tried 'bert-large-uncased', I got a 'CUDA out of memory' error. Is there a workaround for that? Would setting fp16=True help?
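For context, here is what I understand one common OOM workaround to be (a minimal sketch, not from your code): shrink the per-step batch and accumulate gradients over several micro-batches, which gives the same gradient as one large-batch step. The tiny `torch.nn.Linear` model here is just a hypothetical stand-in for the large QA model; fp16/mixed precision (e.g. `torch.cuda.amp`) would be a complementary memory saving, but it needs a GPU to demonstrate.

```python
import torch

torch.manual_seed(0)
model = torch.nn.Linear(4, 1)   # hypothetical stand-in for a large QA model
data = torch.randn(8, 4)
target = torch.randn(8, 1)
loss_fn = torch.nn.MSELoss()

# Full-batch gradient: what we'd compute if GPU memory allowed.
model.zero_grad()
loss_fn(model(data), target).backward()
full_grad = model.weight.grad.clone()

# Accumulated gradient over 4 micro-batches of 2 examples each.
model.zero_grad()
accum_steps = 4
for chunk_x, chunk_y in zip(data.chunk(accum_steps), target.chunk(accum_steps)):
    # Scale each micro-batch loss so the accumulated sum equals the
    # full-batch mean loss.
    loss = loss_fn(model(chunk_x), chunk_y) / accum_steps
    loss.backward()  # .grad buffers accumulate across backward() calls

# The two gradients match, so training behaviour is preserved while the
# peak memory per step is cut by the accumulation factor.
assert torch.allclose(full_grad, model.weight.grad, atol=1e-6)
```

Is gradient accumulation like this (possibly combined with fp16=True) the recommended way to fit the bert-large reader in memory?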
I haven't fine-tuned the model on SQuAD 2.0. Would doing so improve the accuracy of the answers?
Thanks in advance for your response.