google-research / bert

TensorFlow code and pre-trained models for BERT
https://arxiv.org/abs/1810.04805
Apache License 2.0

Is there a plan to release code for fine-tuning on CoQA dataset? #597

Open saurabh-tripathi opened 5 years ago

saurabh-tripathi commented 5 years ago

Basically, for the SQuAD dataset, the model returns a span of the context as the answer. However, consider the context below:

Jessica went to sit in her rocking chair. Today was her birthday and she was turning 80. Her granddaughter Annie was coming over in the afternoon and Jessica was very excited to see her. Her daughter Melanie and Melanie’s husband Josh were coming as well. 

For the question "How many people are visiting?", the answer should be "Three". "Three" does not appear anywhere in the context, yet it is the correct answer in this case. Is there any way to fine-tune BERT so that it can produce this type of answer? The above example is from the CoQA dataset.
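To make the limitation concrete, here is a minimal sketch (names and the `is_extractive` helper are my own, not part of this repo) that checks whether a gold answer could ever be produced by a span-extraction model like `run_squad.py`, which can only output verbatim substrings of the context:

```python
def is_extractive(context: str, answer: str) -> bool:
    """Return True if the answer appears verbatim (case-insensitive) in the
    context, i.e. a start/end span-prediction head could produce it."""
    return answer.lower() in context.lower()

context = ("Jessica went to sit in her rocking chair. Today was her birthday "
           "and she was turning 80. Her granddaughter Annie was coming over "
           "in the afternoon and Jessica was very excited to see her. Her "
           "daughter Melanie and Melanie's husband Josh were coming as well.")

# A SQuAD-style answer is a span of the context:
print(is_extractive(context, "Annie"))   # True

# The CoQA answer "Three" requires counting, not extraction:
print(is_extractive(context, "Three"))   # False
```

Since the second case fails, the SQuAD-style start/end heads in this repo cannot emit "Three" no matter how they are fine-tuned; handling such answers would need something extra, e.g. an answer-type/counting classification head or a generative decoder on top of BERT.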

wyxcc commented 5 years ago

I have the same confusion. Have you found a solution?

sriyavasudevan commented 2 years ago

Same question, is there any update?