shmsw25 / AmbigQA

An original implementation of EMNLP 2020, "AmbigQA: Answering Ambiguous Open-domain Questions"
https://arxiv.org/abs/2004.10645

In BART generation, attention_mask is not aligned with input_id #8

Closed Yifan-Gao closed 4 years ago

Yifan-Gao commented 4 years ago

https://github.com/shmsw25/AmbigQA/blob/74f9ff85da81bd2925fca2ddb6ea59437335631b/codes/Data.py#L201-L217

After line 207, I think the `attention_mask` should also be cropped, as `attention_mask[idx] = curr_attention_mask[:end_of_question]`. Otherwise, it will not be aligned with `input_ids`.
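A minimal sketch of the misalignment being reported, using plain Python lists in place of the real tokenizer output. The names `input_ids`, `attention_mask`, and `end_of_question` follow the snippet in `Data.py`; the token values and the two helper functions here are illustrative, not the repository's actual code.

```python
def crop_buggy(curr_input_ids, curr_attention_mask, end_of_question):
    # Only input_ids is cropped; the mask keeps its original length,
    # so the two sequences no longer line up position-by-position.
    input_ids = curr_input_ids[:end_of_question]
    attention_mask = curr_attention_mask
    return input_ids, attention_mask

def crop_fixed(curr_input_ids, curr_attention_mask, end_of_question):
    # Crop both sequences to the same boundary so each mask entry
    # still corresponds to the token at the same index.
    input_ids = curr_input_ids[:end_of_question]
    attention_mask = curr_attention_mask[:end_of_question]
    return input_ids, attention_mask

# Illustrative values: a question followed by padding tokens.
curr_input_ids = [0, 713, 16, 10, 864, 2, 1, 1]
curr_attention_mask = [1, 1, 1, 1, 1, 1, 0, 0]
end_of_question = 6

ids, mask = crop_buggy(curr_input_ids, curr_attention_mask, end_of_question)
print(len(ids), len(mask))  # lengths disagree: 6 vs 8

ids, mask = crop_fixed(curr_input_ids, curr_attention_mask, end_of_question)
print(len(ids), len(mask))  # both 6, aligned again
```

With the buggy version, the mask's trailing zeros no longer refer to the padding positions they were meant for, which is exactly the misalignment described above.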

shmsw25 commented 4 years ago

You are right, thanks for catching it! Fixed it now.

Yifan-Gao commented 4 years ago

thanks!