Closed: Trainingzy closed this issue 1 year ago.
Thanks for the reply. The model now works with xformers 0.0.17 in my setup.
The problem is the VRAM. On a 4090/A5000, I can only run the --fast version, which requires about 18GB. On a V100, the normal version can be used, taking up about 28GB.
What is the difference between the normal and fast version?
You can refer to this line. The fast mode doesn't use classifier-free guidance.
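To illustrate the difference, here is a minimal sketch of why skipping classifier-free guidance saves memory. The `unet` function below is a hypothetical stand-in, not the actual Video-P2P model: with guidance, each denoising step runs the network twice (an unconditional and a conditional pass), so the fast mode roughly halves the per-step activation footprint by doing a single conditional pass.

```python
import numpy as np

# Hypothetical stand-in for a diffusion UNet noise-prediction pass;
# the real model is far larger, which is where the VRAM goes.
def unet(latents, cond):
    return latents * 0.9 + cond * 0.1

def step_with_cfg(latents, text_cond, uncond, guidance_scale=7.5):
    # Normal mode: classifier-free guidance needs two forward passes
    # (often batched together, doubling activation memory per step).
    noise_uncond = unet(latents, uncond)
    noise_text = unet(latents, text_cond)
    return noise_uncond + guidance_scale * (noise_text - noise_uncond)

def step_fast(latents, text_cond):
    # Fast mode: a single conditional pass, no guidance term.
    return unet(latents, text_cond)

latents = np.ones((1, 4, 8, 8))
text_cond = np.full_like(latents, 0.5)
uncond = np.zeros_like(latents)

out_cfg = step_with_cfg(latents, text_cond, uncond)
out_fast = step_fast(latents, text_cond)
print(out_cfg.shape, out_fast.shape)
```

The trade-off is that dropping the guidance term usually weakens prompt adherence, which is why the normal mode exists despite the higher memory cost.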
I will temporarily close this issue. You are welcome to reopen it if you still have this problem.
Thanks for your great work.
I cannot run the script run_videop2p.py on either a 4090 or an A5000. I have tried three versions of xformers: xformers 0.0.15.dev0+0bad001.d20230429 leads to this error, while xformers 0.0.16 and xformers 0.0.17 lead to an out-of-memory error. I have also tried both PyTorch 1.12.1 and 1.13.1; neither works.
May I know the xformers version you use?