CompVis / taming-transformers

Taming Transformers for High-Resolution Image Synthesis
https://arxiv.org/abs/2012.09841
MIT License

How to use the sliding attention window mechanism? #244

Open RichardXue123 opened 3 months ago

RichardXue123 commented 3 months ago

I have trained a transformer model that generates 256×256 images. How can I use the sliding attention window mechanism mentioned in the paper to generate higher-resolution images? It would be nice to have sample code!

RichardXue123 commented 3 months ago

Note: this is for an unconditional model.
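
For reference, below is a rough sketch of the sliding-window sampling loop described in Sec. 3.2 of the paper, adapted to the unconditional case. It is not the repository's official script: `gpt`, `sos_tokens`, and the other parameter names are assumptions standing in for the trained minGPT and whatever start/conditioning tokens were prepended during training. The idea is to sample a latent grid larger than the 16×16 training window in raster order, and for each position crop a 16×16 window around it, run the transformer on that window, and sample only the token at the current position.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def sample_sliding_window(gpt, sos_tokens, codebook_size, grid_hw=(32, 32),
                          window=16, temperature=1.0, top_k=100, device="cuda"):
    """Autoregressively fill a latent grid larger than the training window by
    sliding the transformer's attention window across it.

    Assumed inputs (not the repo's exact API):
      gpt        -- the trained minGPT, called as gpt(indices) -> (logits, loss)
      sos_tokens -- long tensor of shape (1, n_cond) with the start/conditioning
                    indices prepended during training; for an unconditional
                    model this is typically a single fixed start token
    """
    H, W = grid_hw                      # must each be >= window
    half = window // 2
    n_cond = sos_tokens.shape[1]

    # start from random codes; every position is resampled in raster order
    idx = torch.randint(0, codebook_size, (1, H, W), device=device)

    for i in range(H):
        for j in range(W):
            # local coordinates of (i, j) inside the window: keep the position
            # roughly centered, clamping the window at the grid borders
            li = min(i, half) if H - i > half else window - (H - i)
            lj = min(j, half) if W - j > half else window - (W - j)
            i0, j0 = i - li, j - lj

            patch = idx[:, i0:i0 + window, j0:j0 + window].reshape(1, -1)
            inp = torch.cat((sos_tokens, patch), dim=1)

            # logits at position t predict token t+1, so the prediction for the
            # patch entry (li, lj) sits at index n_cond + li*window + lj - 1
            logits, _ = gpt(inp[:, :-1])
            logits = logits[:, n_cond + li * window + lj - 1, :] / temperature

            if top_k is not None:
                v, _ = torch.topk(logits, top_k)
                logits[logits < v[:, [-1]]] = -float("inf")

            probs = F.softmax(logits, dim=-1)
            idx[:, i, j] = torch.multinomial(probs, num_samples=1)[:, 0]

    return idx
```

The returned grid of code indices would then be decoded with the first-stage VQGAN decoder to obtain the high-resolution image. The sampling notebook in this repository walks through essentially the same loop for the semantically conditioned models, so it is a good reference for the exact shapes and the decoding step.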

ZhangJinian commented 1 month ago

Hi, which operating system do you run the code on, Windows or Linux? I ran the code on Windows but ran into bugs that I can't fix.