Closed MichaelHopwood closed 4 years ago
You can refer to this example: https://github.com/philipperemy/keras-attention-mechanism/blob/47fc761276c0cd08ce5bb1068d93c52dac090569/examples/example-attention.py#L82
Here is the description of what it does: https://github.com/philipperemy/keras-attention-mechanism#adding-two-numbers
You should be able to adapt it with your input data and your model.
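To make the idea concrete, here is a minimal NumPy sketch of what the attention weights in that example represent: one normalized score per timestep, which you can then plot to see where the model "looks". In Keras you would typically recover these by building a second `Model` whose output is the attention layer's activations; the scoring scheme and names below (`attention_weights`, a dot-product score against a query vector) are illustrative assumptions, not the repo's exact implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_weights(hidden_states, query):
    # hidden_states: (timesteps, units), e.g. RNN outputs
    # query: (units,), a learned scoring vector (hypothetical)
    scores = hidden_states @ query   # one raw score per timestep
    return softmax(scores)           # weights are non-negative and sum to 1

rng = np.random.default_rng(0)
h = rng.normal(size=(900, 4))   # matches the 900-timestep, 4-feature input above
q = rng.normal(size=4)
w = attention_weights(h, q)
print(w.shape)  # (900,): one weight per timestep
# To visualize: plt.plot(w) — the peaks are the timesteps the model attends to.
```

Plotting `w` (or averaging it over many test samples) is the usual way to study which regions of the input the model designates as important.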
When predicting on test data with the trained model, how can I visualize the attention weights? I'd like to study which areas of the input the model designates as important.
For reference, my input data is usually of shape (100, 900, 4), with 3 output classes.
Thanks!