utterances-bot opened 1 year ago
Table of Contents Attention이란? Self-Attention Multi-Head Attention Transformers a. Encoder b. Decoder
https://jio0728.github.io/deep%20learning/Attention/
I think you explained the intuition behind how attention works really well!! Looking forward to the transformer encoder and decoder parts too!!
Attention: Basic Concepts - Jio's Paper Exploration