Closed: utterances-bot closed this thread 3 years ago.
Hello, may I translate and republish this article on a blog in China? I will credit the source and the author.
Hi, I am sorry I did not see this comment until today. You may publish this article with due credit, if you have not done so already. Thank you.
Generalizing Attention in NLP and Understanding Self-Attention | Black Box ML
Generalizing the idea of attention in NLP and understanding the various methods of calculating attention used in the literature so far. Also, understanding and implementing multi-head self-attention using PyTorch.
https://kushalj001.github.io/black-box-ml/attention/bahdanau/self%20attention/bahdanau%20attention/multihead%20attention/pytorch-implemention/2020/07/06/Generalizing-Attention-in-NLP-and-Understanding-Self-Attention.html
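For anyone arriving via this thread: the post linked above implements multi-head self-attention in PyTorch. As a quick orientation, here is a minimal sketch of the standard technique, not the author's exact code; the class name `MultiHeadSelfAttention` and the single fused `qkv` projection are my own illustrative choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadSelfAttention(nn.Module):
    """Minimal multi-head self-attention sketch.

    Assumes embed_dim is divisible by num_heads; this is an illustrative
    implementation of the standard technique, not the linked post's code.
    """
    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One linear layer projects the input to queries, keys and values at once.
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        batch, seq_len, embed_dim = x.shape
        # Split the fused projection into q, k, v, each (batch, seq_len, embed_dim).
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Reshape each to (batch, num_heads, seq_len, head_dim).
        q = q.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / (self.head_dim ** 0.5)
        weights = F.softmax(scores, dim=-1)
        context = weights @ v  # (batch, num_heads, seq_len, head_dim)
        # Concatenate heads back together and apply the output projection.
        context = context.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.out(context)

# Example: a batch of 2 sequences, 5 tokens each, 16-dim embeddings, 4 heads.
x = torch.randn(2, 5, 16)
attn = MultiHeadSelfAttention(embed_dim=16, num_heads=4)
print(attn(x).shape)  # torch.Size([2, 5, 16])
```

For the full derivation, attention-scoring variants (e.g. Bahdanau attention), and the step-by-step implementation, see the article itself.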