rainofmine / Face_Attention_Network

PyTorch implementation of Face Attention Network

What is the attention technique applied to this paper? #14

Open yongyongdown opened 4 years ago

yongyongdown commented 4 years ago

Hello, thank you for your interesting research.

I've been trying to understand the attention technique used in this paper, but I'm not sure exactly how it works.

I have studied RNNs, LSTMs, attention, self-attention (Transformers), etc.

However, those are usually explained in the context of natural language processing.

I saw that attention is also used on the vision side, but I don't yet understand it in detail. How exactly does attention work in this paper?
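From skimming the paper, my rough guess is that it is not NLP-style self-attention but a spatial attention map: the network predicts a per-pyramid-level map (supervised by face regions), and the feature map is reweighted elementwise before the detection heads. Here is a minimal NumPy sketch of what I think happens; the shapes and the `exp` weighting are my assumptions from reading the paper, not this repo's actual code. Please correct me if I'm wrong:

```python
import numpy as np

def spatial_attention(features, att_logits):
    """Reweight a feature map with a predicted spatial attention map.

    features:   (C, H, W) feature map from one pyramid level
    att_logits: (H, W) attention logits predicted for that level

    As I read the paper, exp(att_logits) is applied elementwise, so
    features keep their original scale where the logits are ~0 and
    are amplified where the attention map is positive (face regions).
    """
    att = np.exp(att_logits)           # (H, W) -- my guess at the weighting form
    return features * att[None, :, :]  # broadcast the map over all channels

# Toy example: the map amplifies only the center location.
feats = np.ones((2, 3, 3))   # 2 channels, 3x3 spatial grid
logits = np.zeros((3, 3))
logits[1, 1] = 1.0           # "attend" to the center pixel
out = spatial_attention(feats, logits)
```

In this toy run the center location is scaled by e while every other location is left unchanged, which is how I understand "attention" suppressing background and emphasizing face regions here.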

If you let me know, it will be very helpful to me. Thank you.