AI Basics inconsistent numbering for "Attention Mechanism" #14

Open · pdehaan opened this issue 1 year ago

pdehaan commented 1 year ago

Sub-section title

AI Basics: When I send a Transformer-based LLM a “prompt”, what happens internally in more technical terms?

Please describe your issue

I'm unclear on how "Attention Mechanism" fits into the list: it appears without a number and seems to break the numbering, although the numbering continues at "6. Output Generation" below. Should it be "6. Attention Mechanism" followed by "7. Output Generation"?

Describe the solution you'd like to see

Fix the indentation/numbering. I'm unclear whether "Attention Mechanism" is meant to be a sub-step of "Decoder" or a top-level step of its own, and the right fix depends on that (see the sketch below).
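
For context (this is not from the guide itself): in a standard Transformer, attention is a sub-component computed inside each encoder/decoder layer rather than a separate stage of the pipeline, which may be why the heading reads ambiguously. Below is a minimal NumPy sketch of that relationship; the names `attention` and `decoder_layer` are illustrative, not anything defined in the guide.

```python
import numpy as np

def attention(q, k, v):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                     # token-to-token similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ v                                # weighted mix of value vectors

def decoder_layer(x, w_q, w_k, w_v):
    """One decoder layer: attention happens *inside* the layer."""
    attended = attention(x @ w_q, x @ w_k, x @ w_v)   # self-attention over the tokens
    return x + attended                               # residual add (feed-forward sublayer omitted)

# Toy run: 4 token embeddings of dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(decoder_layer(x, w_q, w_k, w_v).shape)          # -> (4, 8)
```

If that framing matches the guide's intent, nesting "Attention Mechanism" under "Decoder" would resolve the numbering; if the guide treats it as its own stage, renumbering it to "6." (and "Output Generation" to "7.") works too.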