shyamupa / snli-entailment

Attention model for entailment on the SNLI corpus, implemented in TensorFlow and Keras

Attention #1

Closed dandxy89 closed 8 years ago

dandxy89 commented 8 years ago

Would you be able to explain, in the wiki, the logic behind how you implemented the attention mechanism in Keras?

Alternatively, my email is Dan.dixey@gmail.com.

shyamupa commented 8 years ago

Look at equations 7, 8, 9, and 10 in the paper.
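For readers landing here later: a minimal NumPy sketch of that attention step might look like the following. This assumes the paper in question is Rocktäschel et al., "Reasoning about Entailment with Neural Attention", where equations 7–10 compute M = tanh(W_y·Y + W_h·h_N ⊗ e_L), α = softmax(wᵀM), r = Y·αᵀ, and h* = tanh(W_p·r + W_x·h_N). The function and weight names below just mirror that notation and are not from this repo's code.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D array
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention(Y, h_N, W_y, W_h, w, W_p, W_x):
    """Attention over premise states Y (k x L) given hypothesis state h_N (k,).

    Mirrors eqs. 7-10 of Rocktaschel et al. (names are the paper's, not the repo's).
    """
    L = Y.shape[1]
    # Eq. 7: M = tanh(W_y Y + (W_h h_N) outer e_L) -- broadcast h_N term over all L columns
    M = np.tanh(W_y @ Y + (W_h @ h_N)[:, None] * np.ones((1, L)))
    # Eq. 8: attention weights over the L premise words
    alpha = softmax(w @ M)
    # Eq. 9: r is the attention-weighted combination of premise states
    r = Y @ alpha
    # Eq. 10: final sentence-pair representation
    h_star = np.tanh(W_p @ r + W_x @ h_N)
    return h_star, alpha

# Usage with random weights, hidden size k=4 and premise length L=3
rng = np.random.default_rng(0)
k, L = 4, 3
Y = rng.standard_normal((k, L))
h_N = rng.standard_normal(k)
W_y, W_h, W_p, W_x = (rng.standard_normal((k, k)) for _ in range(4))
w = rng.standard_normal(k)
h_star, alpha = attention(Y, h_N, W_y, W_h, w, W_p, W_x)
```

In the Keras version this would be expressed with `Dense`/`TimeDistributed` layers and a merge over the time axis, but the linear algebra is the same.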

dandxy89 commented 8 years ago

Thanks