eugeneyan / eugeneyan-comments


https://eugeneyan.com/writing/attention/ #74

Open utterances-bot opened 1 year ago

utterances-bot commented 1 year ago

Some Intuition on Attention and the Transformer

What's the big deal, intuition on query-key-value vectors, multiple heads, multiple layers, and more.

https://eugeneyan.com/writing/attention/

nikam-shreyas commented 1 year ago

Wow, this was such an insightful dive into Transformers! I loved how you broke down the core concepts; your explanations really helped solidify my understanding.

The library analogy, with keys and values being matched against a query, was especially enlightening. I walked away feeling like I have a much stronger handle on how these models work.
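For anyone reading along, the query-key-value idea can be sketched as scaled dot-product attention: each query is scored against every key, the scores are softmaxed into weights, and the output is the weighted mix of values. This is a minimal NumPy sketch for intuition, not code from the post; the shapes and names are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Score each query against each key, scale by sqrt(d_k),
    # then take a softmax-weighted average of the values.
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n_queries, n_keys)
    weights = softmax(scores, axis=-1)   # rows sum to 1
    return weights @ V                   # (n_queries, d_v)

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))  # 2 queries of dim 4
K = rng.normal(size=(3, 4))  # 3 keys
V = rng.normal(size=(3, 4))  # 3 values
out = attention(Q, K, V)     # shape (2, 4): one blended value per query
```

A query that scores all keys equally (e.g. an all-zero query) just returns the mean of the values, which matches the "blend the shelves you match" intuition of the library analogy.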

Thank you for putting this together! I would highly recommend your material to anyone looking to better understand Transformers. Keep up the great work!

eugeneyan commented 1 year ago

Wow, thank you for the kind words! Your feedback encourages me to write more and keep simplifying these concepts 🙏