d2l-ai / d2l-en

Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
https://D2L.ai

Transformer encoder -> Transformer decoder #2606

Open MassEast opened 2 weeks ago

MassEast commented 2 weeks ago

In Section 11.9.3 (Decoder-Only), the text should read "GPT pretraining with a Transformer decoder" rather than "GPT pretraining with a Transformer encoder," consistent with what is depicted in Fig. 11.9.6.
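The distinction matters because the decoder block used for GPT pretraining applies a causal self-attention mask, while an encoder block attends bidirectionally. A minimal sketch of that difference (illustrative only, not the d2l API; single head, with queries = keys = values for brevity):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, causal):
    """Single-head self-attention over X of shape (seq_len, d).

    Projection weights are omitted for brevity (Q = K = V = X).
    With causal=True this behaves like a decoder block (GPT-style
    pretraining); with causal=False like an encoder block.
    """
    n, d = X.shape
    scores = X @ X.T / np.sqrt(d)  # (n, n) attention scores
    if causal:
        # Decoder: position i may only attend to positions j <= i.
        mask = np.triu(np.ones((n, n), dtype=bool), k=1)
        scores = np.where(mask, -1e9, scores)
    return softmax(scores, axis=-1) @ X

X = np.random.default_rng(0).normal(size=(4, 8))
out_decoder = self_attention(X, causal=True)   # autoregressive (GPT)
out_encoder = self_attention(X, causal=False)  # bidirectional (BERT-style)
```

Because the first position can attend only to itself under the causal mask, `out_decoder[0]` equals `X[0]`, whereas `out_encoder[0]` mixes in later positions.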

Description of changes:

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.