hp-l33 / AiM

Official PyTorch Implementation of "Scalable Autoregressive Image Generation with Mamba"
MIT License

Positional Encoding #4

Open Bezdarnost opened 2 weeks ago

Bezdarnost commented 2 weeks ago

Thank you very much for your work!

Where can I find more details about your implementation of positional encoding in the model?

hp-l33 commented 2 weeks ago

Hi, the positional encoding can be found within GPT2Embeddings located in models/stage2/mixer_seq_simple.py.