SeoSangwoo / Attention-Based-BiLSTM-relation-extraction

Tensorflow Implementation of "Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification" (ACL 2016)
http://www.aclweb.org/anthology/P16-2034
Apache License 2.0

Attention Weight of padding tokens? #19

Open vhientran opened 5 years ago

vhientran commented 5 years ago

Please let me know whether this model architecture computes attention weights over padding tokens or not.
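For context on why this matters: without an explicit mask, the softmax in the attention layer assigns nonzero weight to padding positions as well. A common remedy (a generic sketch, not necessarily what this repository implements) is to set the logits at padding positions to a large negative value before the softmax, so their weights become effectively zero:

```python
import numpy as np

def masked_attention(scores, mask):
    """Softmax over attention logits with padding positions suppressed.

    scores: (seq_len,) raw attention logits
    mask:   (seq_len,) 1 for real tokens, 0 for padding
    """
    # Replace padding logits with a large negative value so that
    # exp(-1e9) ~ 0 and padding receives (near-)zero attention weight.
    masked = np.where(mask.astype(bool), scores, -1e9)
    e = np.exp(masked - masked.max())  # subtract max for numerical stability
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.5, 0.3])
mask = np.array([1, 1, 0, 0])  # last two positions are padding
weights = masked_attention(scores, mask)
# weights at the padding positions are effectively zero,
# and the remaining weights still sum to 1
```

If the implementation here applies its attention softmax over the full padded sequence without such a mask, padding tokens would indeed receive nonzero attention weight.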