SeoSangwoo / Attention-Based-BiLSTM-relation-extraction

Tensorflow Implementation of "Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification" (ACL 2016)
http://www.aclweb.org/anthology/P16-2034
Apache License 2.0

Questions to ask #21

Open xxxxyan opened 5 years ago

xxxxyan commented 5 years ago

Hello, SeoSangwoo! Thank you for sharing such valuable code. I would like to ask you a question about the code: why did you use tf.reduce_sum here to reduce dimensions? Is it the best way? I would appreciate it if you could answer my question.

output = tf.reduce_sum(inputs * tf.expand_dims(alphas, -1), 1)
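
For context, here is a minimal sketch of what that line computes. This is not the repository's own code; it is written against TensorFlow 2.x eager mode (the repo itself targets TF 1.x), and the batch/sequence/hidden sizes and the random `alphas` are made-up values just to show the shapes:

```python
import tensorflow as tf

# Assumed toy dimensions: batch of 2, sequence length 3, hidden size 4
batch, time, hidden = 2, 3, 4
inputs = tf.random.normal([batch, time, hidden])                   # BiLSTM outputs H, one vector per time step
alphas = tf.nn.softmax(tf.random.normal([batch, time]), axis=-1)   # attention weights, one scalar per time step

# Attention-weighted sum over the time axis: r = sum_t alpha_t * h_t
weighted = inputs * tf.expand_dims(alphas, -1)   # [batch, time, 1] broadcasts -> [batch, time, hidden]
output = tf.reduce_sum(weighted, axis=1)         # [batch, hidden], the sentence representation

# An equivalent formulation via matmul, treating alphas as a [batch, 1, time] matrix
output_matmul = tf.squeeze(tf.matmul(tf.expand_dims(alphas, 1), inputs), axis=1)

print(output.shape, output_matmul.shape)                       # (2, 4) (2, 4)
print(tf.reduce_max(tf.abs(output - output_matmul)).numpy())   # ~0, both compute the same thing
```

So tf.reduce_sum here is not arbitrary dimension reduction: summing the alpha-scaled hidden states over the time axis is exactly the weighted average r = H·α from the paper, and a matmul-based version gives the same result.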