Sshanu / Relation-Classification-using-Bidirectional-LSTM-Tree

TensorFlow implementation of the papers "End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures" and "Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths" for relation classification
MIT License

Can an attention mechanism be added to the code? #7

Closed dyf0631 closed 6 years ago

Sshanu commented 6 years ago

Yes. In the model that uses LSTM networks along shortest dependency paths, you can add an attention layer after the bidirectional LSTM layer, over the hidden states of the words in the LCA sub-tree. In the LSTM-tree model, you can try adding attention after the sequential LSTM layer.
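For reference, here is a minimal NumPy sketch of the kind of attention pooling described above: score each BiLSTM hidden state with a learnable vector, softmax the scores into weights, and take the weighted sum as the sentence representation. The function and parameter names (`attention_pool`, `w`) are hypothetical, not from this repo's code.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w):
    """Attention-pool a sequence of hidden states into one vector.

    H: (T, d) BiLSTM hidden states (both directions concatenated)
    w: (d,)   learnable scoring vector (hypothetical parameter)
    Returns the weighted sum (d,) and the attention weights (T,).
    """
    scores = np.tanh(H) @ w   # one scalar score per timestep, shape (T,)
    alpha = softmax(scores)   # attention weights, sum to 1
    return alpha @ H, alpha

# Toy example: 4 timesteps, hidden size 6
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 6))
w = rng.standard_normal(6)
vec, alpha = attention_pool(H, w)
```

In the actual model, `w` would be a trained variable and the pooled vector would feed the relation classifier in place of (or alongside) the final LSTM state.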

dyf0631 commented 6 years ago

Thank you!