kushalj001 / black-box-ml

My blog about important ML/NLP concepts
https://kushalj001.github.io/black-box-ml/
Apache License 2.0

Generalizing Attention in NLP and Understanding Self-Attention | Black Box ML #5

Closed by utterances-bot 3 years ago

utterances-bot commented 3 years ago

Generalizing Attention in NLP and Understanding Self-Attention | Black Box ML

Generalizing the idea of attention in NLP and understanding the various methods of calculating attention used in the literature so far. The post also explains and implements multi-headed self-attention using PyTorch.

https://kushalj001.github.io/black-box-ml/attention/bahdanau/self%20attention/bahdanau%20attention/multihead%20attention/pytorch-implemention/2020/07/06/Generalizing-Attention-in-NLP-and-Understanding-Self-Attention.html
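The post linked above implements multi-headed self-attention in PyTorch. As a minimal sketch of that technique (not the author's actual code; the class and variable names here are hypothetical), scaled dot-product attention is computed independently per head and the head outputs are concatenated and projected back:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadSelfAttention(nn.Module):
    """Minimal multi-headed self-attention sketch (hypothetical names)."""

    def __init__(self, embed_dim, num_heads):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One linear layer produces queries, keys, and values together.
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x):
        batch, seq_len, embed_dim = x.shape
        # Project to Q, K, V, then split each into (num_heads, head_dim).
        q, k, v = self.qkv(x).chunk(3, dim=-1)

        def split_heads(t):
            return t.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = split_heads(q), split_heads(k), split_heads(v)
        # Scaled dot-product attention, computed per head in parallel.
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        weights = F.softmax(scores, dim=-1)
        context = weights @ v
        # Concatenate heads back into embed_dim and apply the output projection.
        context = context.transpose(1, 2).contiguous().view(batch, seq_len, embed_dim)
        return self.out(context)

x = torch.randn(2, 5, 16)          # (batch, sequence length, embedding dim)
mha = MultiHeadSelfAttention(embed_dim=16, num_heads=4)
out = mha(x)
print(out.shape)                   # same shape as the input: (2, 5, 16)
```

The output has the same shape as the input, which is what lets such blocks be stacked in a Transformer encoder.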

hk975 commented 3 years ago

Hello, may I translate and republish this article on a blog in China? I will credit the source and the author.

kushalj001 commented 3 years ago

Hi, I am sorry I did not see this comment until today. You may publish this article with due credit if you have not already done so. Thank you.