cdqa-suite / cdQA

⛔ [NOT MAINTAINED] An End-To-End Closed Domain Question Answering System.
https://cdqa-suite.github.io/cdQA-website/
Apache License 2.0

Use different pre-trained weight #347

Open Charon922 opened 4 years ago

Charon922 commented 4 years ago

Hi,

I have a few questions I would like to ask:

  1. In the reader, the pretrained weights you use for BertForQuestionAnswering are 'bert-base-uncased'. Is it okay to use other pretrained weights such as 'bert-large-uncased-whole-word-masking-finetuned-squad'? Would that improve the results?

  2. When I used 'bert-large-uncased', I got a 'CUDA out of memory' error. Is there a solution for that? Would fp16=True help?

  3. I haven't trained the model on SQuAD 2.0. Would doing so increase the accuracy of the answers?
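Regarding question 2, my rough understanding of why fp16=True could help is that half precision halves the per-parameter weight storage. A back-of-the-envelope sketch (the 340M parameter count for BERT-large is approximate, and activations, gradients, and optimizer state are not counted here):

```python
import numpy as np

# BERT-large has roughly 340M parameters (approximate figure, for
# illustration only). This estimates weight storage alone, ignoring
# activations, gradients, and optimizer state:
n_params = 340_000_000

fp32_bytes = n_params * np.dtype(np.float32).itemsize  # 4 bytes/param
fp16_bytes = n_params * np.dtype(np.float16).itemsize  # 2 bytes/param

print(f"fp32 weights: {fp32_bytes / 2**30:.2f} GiB")  # ~1.27 GiB
print(f"fp16 weights: {fp16_bytes / 2**30:.2f} GiB")  # ~0.63 GiB
```

If fp16 alone is not enough, I assume lowering the reader's training batch size or maximum sequence length would also reduce memory use, though I'm not sure which of those knobs cdQA exposes.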

Thanks in advance for your response.