Closed Bezdarnost closed 1 month ago
Thank you very much for your work!
Where can I find more details about your implementation of positional encoding in the model?
Hi, the positional encoding is implemented in the `GPT2Embeddings` class, located in `models/stage2/mixer_seq_simple.py`.
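For context: a `GPT2Embeddings` module in GPT-2-style codebases typically combines a token embedding with a learned absolute positional embedding that is simply added to it. Below is a minimal sketch under that assumption — the class and parameter names are illustrative, not the repo's actual API; consult `models/stage2/mixer_seq_simple.py` for the real implementation.

```python
import torch
import torch.nn as nn

class LearnedPositionalEmbeddings(nn.Module):
    """Illustrative GPT-2-style embeddings: token embedding plus a
    learned absolute position embedding (not the repo's actual code)."""
    def __init__(self, vocab_size: int, max_seq_len: int, d_model: int):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)   # one vector per token id
        self.pos = nn.Embedding(max_seq_len, d_model)  # one vector per position

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # input_ids: (batch, seq_len)
        seq_len = input_ids.shape[1]
        positions = torch.arange(seq_len, device=input_ids.device)
        # Position embedding broadcasts over the batch dimension.
        return self.tok(input_ids) + self.pos(positions)

emb = LearnedPositionalEmbeddings(vocab_size=100, max_seq_len=64, d_model=32)
out = emb(torch.randint(0, 100, (2, 16)))
print(out.shape)  # torch.Size([2, 16, 32])
```

The position indices are recomputed from the input length each forward pass, so the same module handles any sequence length up to `max_seq_len`.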