Open · ghost opened this issue 5 years ago
Hello Sandeep, version_2_with_negative implies that some of the examples do not have answers, so the model treats a blank string as the answer for some questions.
Try removing the "--version_2_with_negative=True" parameter or setting it to False. This should work; we had a similar issue and setting it to False fixed it.
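For reference, here is roughly what the invocation looks like with the flag disabled. The paths and hyperparameters below are placeholders, not your actual values, and with the flag off you would train and predict on the SQuAD 1.1-style files:

```bash
python run_squad.py \
  --vocab_file=$BERT_DIR/vocab.txt \
  --bert_config_file=$BERT_DIR/bert_config.json \
  --init_checkpoint=$BERT_DIR/bert_model.ckpt \
  --do_train=True \
  --train_file=train-v1.1.json \
  --do_predict=True \
  --predict_file=dev-v1.1.json \
  --train_batch_size=12 \
  --learning_rate=3e-5 \
  --num_train_epochs=2.0 \
  --max_seq_length=384 \
  --doc_stride=128 \
  --output_dir=/tmp/squad_output/ \
  --version_2_with_negative=False
```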
Hi, thanks for the reply. However, the answers are present in the training data; that is why the model trained on SQuAD 1.1 gave answers.
It is expected to give a blank answer only when no answer is found in the document, but it gave a blank answer for everything.
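For what it's worth, when version_2_with_negative is on, run_squad.py predicts the blank answer whenever the null score exceeds the best span score by more than --null_score_diff_threshold (default 0.0), and it writes the per-question null scores to null_odds.json. The BERT README's suggested procedure is to let the official SQuAD 2.0 evaluator pick the threshold and then re-run prediction with it. A sketch, with placeholder paths:

```bash
# Prediction with --version_2_with_negative=True writes null_odds.json
# next to predictions.json. The official evaluator can report the
# threshold that maximizes F1 ("best_f1_thresh"):
python evaluate-v2.0.py dev-v2.0.json ./squad_output/predictions.json \
  --na-prob-file ./squad_output/null_odds.json

# Re-run prediction with that threshold. A higher (more positive) value
# makes blank predictions rarer; the value here is purely illustrative:
THRESH=-2.0
python run_squad.py \
  --vocab_file=$BERT_DIR/vocab.txt \
  --bert_config_file=$BERT_DIR/bert_config.json \
  --init_checkpoint=./squad_output/model.ckpt-10859 \
  --do_predict=True \
  --predict_file=dev-v2.0.json \
  --max_seq_length=384 \
  --doc_stride=128 \
  --version_2_with_negative=True \
  --null_score_diff_threshold=$THRESH \
  --output_dir=./squad_output/
```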
Having the same issue. I replaced the start and end token indexes with -1 when there is no answer in the context. I understand that the embedding layer won't accept that, but I can't find a way to make it work; see the sketch below.
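If it helps, the repo's run_squad.py handles unanswerable examples differently: it points both the start and end positions at the [CLS] token (index 0) rather than using -1, which the embedding lookup and softmax loss cannot index. A simplified sketch of that idea, not the repo's exact code:

```python
# Simplified sketch of how run_squad.py labels unanswerable SQuAD 2.0
# examples: instead of a sentinel like -1, both answer positions point
# at the [CLS] token, which is always at index 0 of the input sequence.

def answer_positions(start_position, end_position, is_impossible):
    """Return the (start, end) token indices used as the training target."""
    if is_impossible:
        # No answer in this context: train the model to select [CLS].
        return 0, 0
    return start_position, end_position

# An unanswerable question maps to the [CLS] position:
assert answer_positions(17, 21, is_impossible=True) == (0, 0)
assert answer_positions(17, 21, is_impossible=False) == (17, 21)
```

At prediction time, the model's score for the [CLS] span is what becomes the null score that gets compared against --null_score_diff_threshold.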
I trained BERT on SQuAD 2.0. The model.ckpt-10859 checkpoint was generated, and I used it as the initial checkpoint for predictions.
However, the answer is now blank for every question. For the same questions, the model trained on SQuAD 1.1 gives answers. What could be going wrong?
Below is a sample output:
And below are the commands I ran for training: