shmsw25 / AmbigQA

An original implementation of EMNLP 2020, "AmbigQA: Answering Ambiguous Open-domain Questions"
https://arxiv.org/abs/2004.10645

Question About the Results #15

Closed · NoviScl closed this issue 3 years ago

NoviScl commented 4 years ago

Hi Sewon,

For the provided checkpoint named "DPR Reader trained on AmbigNQ (387M)", is it trained only on the AmbigNQ training set, or on the NQ + AmbigNQ training sets? I tried to train a DPR Reader with a BERT-base span extractor on the AmbigNQ training set alone, and the result I got is lower than that of the provided checkpoint.

Best, Chenglei

shmsw25 commented 4 years ago

Hi @NoviScl, yes, you are right: it is pretrained on NQ and then fine-tuned on AmbigNQ. Sorry that this wasn't clearer.
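In other words, the reader weights are initialized from an NQ-trained checkpoint rather than from plain BERT before fine-tuning on AmbigNQ. A rough sketch of that two-stage recipe is below; the checkpoint path, data loader, and hyperparameters are placeholders, not the repo's actual filenames or settings.

```python
# Minimal sketch of the two-stage recipe: pretrain the reader on NQ,
# then continue training the same weights on AmbigNQ.
import torch
from transformers import BertForQuestionAnswering

# Stage 1 output: a span-extraction reader already trained on NQ.
nq_checkpoint = "checkpoints/dpr_reader_nq.pt"  # hypothetical path
model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")
model.load_state_dict(torch.load(nq_checkpoint, map_location="cpu"))

# Stage 2: fine-tune on the AmbigNQ training set, starting from the
# NQ-trained weights instead of plain BERT.
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
model.train()
for batch in ambignq_train_loader:  # hypothetical DataLoader over AmbigNQ
    outputs = model(
        input_ids=batch["input_ids"],
        attention_mask=batch["attention_mask"],
        start_positions=batch["start_positions"],
        end_positions=batch["end_positions"],
    )
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```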