philipperemy / keras-attention

Keras Attention Layer (Luong and Bahdanau scores).
Apache License 2.0

Visualizing attention weights with input arrays #42

Closed · MichaelHopwood closed this issue 4 years ago

MichaelHopwood commented 4 years ago

When predicting on test data with the trained model, how can I visualize the attention weights? I'd like to study which areas the model designates as important.

For reference, my input data is usually of shape (100, 900, 4), and there are 3 output classes.

Thanks!

philipperemy commented 4 years ago

You can refer to this example: https://github.com/philipperemy/keras-attention-mechanism/blob/47fc761276c0cd08ce5bb1068d93c52dac090569/examples/example-attention.py#L82

Here is a description of what it does: https://github.com/philipperemy/keras-attention-mechanism#adding-two-numbers

You should be able to adapt it to your input data and your model.
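For readers landing here later, a minimal sketch of the general idea behind the linked example: keep a named layer that produces the softmax attention scores, then after training build a second `keras.Model` that shares the trained weights but outputs that layer's activations, and plot them per test sample. Note this hand-rolls a simple attention block with plain Keras layers rather than using this repo's `Attention` class, and the layer name `'attention_weight'`, the architecture, and the random stand-in data are all illustrative assumptions, not part of this package's API.

```python
import numpy as np
import matplotlib.pyplot as plt
from tensorflow import keras
from tensorflow.keras import layers

TIME_STEPS, FEATURES, NUM_CLASSES = 900, 4, 3  # matches the shapes in the question

# Classifier with a named attention block over the time axis.
inputs = keras.Input(shape=(TIME_STEPS, FEATURES))
h = layers.LSTM(64, return_sequences=True)(inputs)                 # (batch, 900, 64)
score = layers.Dense(1)(h)                                         # (batch, 900, 1)
weights = layers.Softmax(axis=1, name='attention_weight')(score)   # normalize over time steps
context = layers.Flatten()(layers.Dot(axes=1)([weights, h]))       # weighted sum -> (batch, 64)
outputs = layers.Dense(NUM_CLASSES, activation='softmax')(context)

model = keras.Model(inputs, outputs)
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
# ... model.fit(x_train, y_train, ...) ...

# Second model sharing the trained weights, but emitting the attention scores.
attention_model = keras.Model(inputs=model.input,
                              outputs=model.get_layer('attention_weight').output)

x_test = np.random.rand(100, TIME_STEPS, FEATURES).astype('float32')  # stand-in test data
attn = attention_model.predict(x_test).squeeze(-1)                    # (100, 900)

# One attention weight per time step; peaks show where the model "looked".
plt.plot(attn[0])
plt.xlabel('time step')
plt.ylabel('attention weight')
plt.title('Attention weights for sample 0')
plt.show()
```

If you use this repo's `Attention` layer instead, check `model.summary()` for the name of the layer that outputs the softmax scores and point the second model at that layer's output in the same way.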