jerryji1993 / DNABERT

DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome
https://doi.org/10.1093/bioinformatics/btab083
Apache License 2.0

There is a bug in the attention mask handling in the source code #102

Open Jason941 opened 1 year ago

Jason941 commented 1 year ago

Here is the location: `src/transformer/modeling_utils.py`, lines 192–194:

```python
exteneded_attention_mask = extended_attention_mask.to(dtype=self.dtype)  # line 192
exteneded_attention_mask = (1.0 - extended_attention_mask) * -10000.0    # line 193
return extended_attention_mask                                           # line 194
```
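For context, these lines are meant to convert a 0/1 attention mask into an additive mask for the attention softmax: attended positions become 0.0 and padded positions become -10000.0. A minimal standalone illustration of the intended computation (PyTorch; the toy tensor and variable names are my own, not from the repo):

```python
import torch

# A toy padding mask: 1 = real token, 0 = padding (shape: [batch, seq_len]).
attention_mask = torch.tensor([[1, 1, 1, 0, 0]])

# Broadcast to [batch, 1, 1, seq_len] and convert to an additive mask:
# real tokens map to 0.0, padding positions map to -10000.0.
extended = attention_mask[:, None, None, :].to(torch.float32)
extended = (1.0 - extended) * -10000.0
print(extended)  # 0.0 for the first three positions, -10000.0 for the last two
```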

The returned variable name is wrong: lines 192 and 193 assign to the misspelled `exteneded_attention_mask`, but line 194 returns `extended_attention_mask`. The dtype cast and the `(1.0 - mask) * -10000.0` transformation are therefore silently discarded, and the raw 0/1 mask is returned instead of the additive mask.
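A minimal sketch of the fix, assuming the only problem is the misspelling: use one consistent name so the cast and the additive-mask transformation actually reach the returned value.

```python
# Suggested fix: one consistent variable name, so the converted,
# additive mask is the value that gets returned.
extended_attention_mask = extended_attention_mask.to(dtype=self.dtype)  # cast for fp16 compatibility
extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0    # 1 -> 0.0, 0 -> -10000.0
return extended_attention_mask
```

With this change, padded positions receive a large negative bias before the softmax, driving their attention weights to approximately zero. With the original code, the untransformed 0/1 mask would be added to the attention scores instead, so padding tokens would remain attendable.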