AI Basics: When I send a Transformer-based LLM a “prompt”, what happens internally in more technical terms?
Please describe your issue
I'm unclear on where "Attention Mechanism" fits in the list. It appears as an unnumbered heading that breaks the numbering, although the numbering resumes at "6. Output Generation" below. Should it be "6. Attention Mechanism" followed by "7. Output Generation"?
Describe the solution you'd like to see
Fix the indentation/numbering. I'm unclear whether "Attention Mechanism" is meant to be a sub-step of "Decoder" (and should be indented under it) or a separate step (and should be numbered "6."); the correct fix depends on which is intended.