yongyongdown opened 4 years ago
Hello, thank you for your interesting research.
I've been studying the attention mechanism featured in this paper, but I'm not sure exactly how it works.
I have studied RNN, LSTM, attention, and self-attention (Transformer), but those are usually explained in the context of natural language processing.
I see that this paper applies attention on the vision side, but I don't yet understand the details. How exactly does attention work in this paper?
Any explanation would be very helpful to me. Thank you.