wooyeolBaek / attention-map

🚀 Cross attention map tools for huggingface/diffusers
https://huggingface.co/spaces/We-Want-GPU/diffusers-cross-attention-map-SDXL-t2i
MIT License

OUT OF MEMORY #5

Open chenbinghui1 opened 5 months ago

chenbinghui1 commented 5 months ago

When I replace the attention processor with the one from this project, GPU memory usage increases. Does anyone know the reason? A rough sketch of the setup I mean is below.
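A minimal, hypothetical sketch (not this repo's actual processor) of the pattern being described: a map-saving processor that keeps every attention map it sees. The class and method names here are made up for illustration; holding the maps on the GPU in full precision is the usual reason device memory grows after such a swap, and offloading or clearing them avoids it.

```python
import torch

class AttnMapStore:
    """Hypothetical store for cross-attention maps (illustration only)."""

    def __init__(self, offload_to_cpu: bool = True):
        self.maps = {}
        self.offload_to_cpu = offload_to_cpu

    def save(self, name: str, attn_probs: torch.Tensor) -> None:
        attn_probs = attn_probs.detach()
        if self.offload_to_cpu:
            # Move each map off the GPU right away so only the current map
            # occupies device memory; keep it in half precision to save RAM.
            attn_probs = attn_probs.to("cpu", dtype=torch.float16)
        self.maps[name] = attn_probs

    def clear(self) -> None:
        # Drop stored maps and return their cached CUDA blocks to the driver.
        self.maps.clear()
        if torch.cuda.is_available():
            torch.cuda.empty_cache()
```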

wooyeolBaek commented 5 months ago

Have you tried torch.cuda.empty_cache()?
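One way to try this suggestion after a generation call, with `pipe` assumed to be an already-loaded diffusers pipeline; this is a sketch, not code from this repo:

```python
import gc
import torch

# image = pipe(prompt).images[0]   # run your generation first

gc.collect()                 # drop Python references to intermediate tensors
if torch.cuda.is_available():
    torch.cuda.empty_cache()     # return cached, unused blocks to the driver
    print(torch.cuda.memory_allocated() / 1e9, "GB still allocated")
    print(torch.cuda.memory_reserved() / 1e9, "GB reserved by the caching allocator")
```

Note that torch.cuda.empty_cache() only releases cached blocks that are no longer allocated; any attention maps still referenced in Python (for example, ones kept on the GPU by the processor) are not freed until those references are dropped.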