hp-l33 / AiM

Official PyTorch Implementation of "Scalable Autoregressive Image Generation with Mamba"
MIT License

Positional Encoding #4

Closed: Bezdarnost closed this issue 1 month ago

Bezdarnost commented 2 months ago

Thank you very much for your work!

Where can I find more details about your implementation of positional encoding in the model?

hp-l33 commented 2 months ago

Hi, the positional encoding is implemented in the GPT2Embeddings class in models/stage2/mixer_seq_simple.py.
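
For context, GPT-2-style embeddings typically sum a token embedding with a learned absolute position embedding. Below is a minimal sketch of that general pattern; the class and parameter names are illustrative and not copied from the AiM source, so check the referenced file for the actual implementation:

```python
import torch
import torch.nn as nn

class GPT2StyleEmbeddings(nn.Module):
    """Token embedding plus learned absolute position embedding (GPT-2 pattern)."""

    def __init__(self, vocab_size: int, embed_dim: int, max_seq_len: int):
        super().__init__()
        self.word_embeddings = nn.Embedding(vocab_size, embed_dim)
        # One learned vector per position index, added to every token embedding.
        self.position_embeddings = nn.Embedding(max_seq_len, embed_dim)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # input_ids: (batch, seq_len) integer token indices
        seq_len = input_ids.size(1)
        positions = torch.arange(seq_len, device=input_ids.device)
        # Broadcasting adds the (seq_len, embed_dim) position table
        # to the (batch, seq_len, embed_dim) token embeddings.
        return self.word_embeddings(input_ids) + self.position_embeddings(positions)

# Usage: embed a batch of 2 sequences of 16 token ids (sizes are arbitrary)
emb = GPT2StyleEmbeddings(vocab_size=16384, embed_dim=768, max_seq_len=256)
tokens = torch.randint(0, 16384, (2, 16))
out = emb(tokens)  # shape: (2, 16, 768)
```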