farizrahman4u / seq2seq

Sequence to Sequence Learning with Keras
GNU General Public License v2.0

any way to visualize attention activation? #272

Open cristianmtr opened 5 years ago

cristianmtr commented 5 years ago

Hey

Is there any way to get the attention weights? That would be super nice :)
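In stock Keras, intermediate tensors can be read by wiring a sub-model to a named layer's output. A minimal sketch of that pattern, assuming the attention weights were exposed as a layer output (this library computes them inside the recurrent cell, so the cell would need modifying first); the layer name `'attention'` is hypothetical:

```python
from keras.models import Model

def attention_probe(model, layer_name='attention'):
    """Sub-model mapping the model's inputs to one intermediate layer's output.

    Works only if the attention weights are the output of a named layer;
    'attention' is a hypothetical name, not one this library defines.
    """
    return Model(inputs=model.input,
                 outputs=model.get_layer(layer_name).output)

# probe = attention_probe(trained_model)
# alphas = probe.predict(x_batch)  # one weight vector per decoder step
```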

cdhx commented 5 years ago

I want this too, hope somebody can give a demo.

cdhx commented 5 years ago

Or, if someone can show where the attention is in the source code (model / cell). I found this annotation in model.py at line 232:

> The weight alpha[i, j] for each hj is computed as follows:
> energy = a(s(i-1), H(j))
> alpha = softmax(energy)
> Where a is a feed forward network.

And cell.py line 90 shows how alpha is calculated.
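To make the quoted annotation concrete, here is a small NumPy sketch of that exact computation, independent of this library's code; the feed-forward parameterization of `a` (one tanh layer plus a scalar projection) and all shapes are illustrative assumptions:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
T, hidden = 6, 8                       # 6 encoder timesteps, hidden size 8
H = rng.normal(size=(T, hidden))       # encoder states h_1 .. h_T
s_prev = rng.normal(size=(hidden,))    # previous decoder state s(i-1)

# a(s(i-1), h_j): concatenate the states, one tanh layer, scalar projection
W = rng.normal(size=(2 * hidden, hidden))
v = rng.normal(size=(hidden,))
energy = np.array([v @ np.tanh(np.concatenate([s_prev, h]) @ W) for h in H])

alpha = softmax(energy)                # attention weights over the input
print(alpha)                           # sums to 1; one value per input step
```

Collecting `alpha` for every decoder step gives a decoder-step × encoder-step matrix, which is what is usually drawn as a heatmap (e.g. with `matplotlib.pyplot.imshow`).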

cdhx commented 5 years ago

Can anybody help?~