Description of changes:

In section 11.9.3 Decoder-Only, it should say "GPT pretraining with a Transformer decoder" instead of "GPT pretraining with a Transformer encoder", as depicted in Fig. 11.9.6.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.