jikkuatwork opened 2 days ago
Thank you for your advice! In my tests on an A100 GPU, the VRAM usage is approximately 26.38GB. However, you can reduce it to around 17GB by using the offloading method with the following code after loading the pipeline:
```python
pipe.enable_model_cpu_offload()
```
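For context, here is a minimal sketch of where that call fits in the loading flow, assuming a standard diffusers-style pipeline (the model id `"your-org/your-model"` is a placeholder, not the actual repo name):

```python
import torch
from diffusers import DiffusionPipeline  # assumes diffusers is installed

# Placeholder model id -- substitute the actual Hugging Face repo here.
pipe = DiffusionPipeline.from_pretrained(
    "your-org/your-model",
    torch_dtype=torch.bfloat16,
)

# Instead of pipe.to("cuda"), let submodules be moved to the GPU only
# when they are needed and offloaded back to CPU RAM afterwards.
# This is what cuts peak VRAM (~26GB -> ~17GB in the test above).
pipe.enable_model_cpu_offload()
```

Note that `enable_model_cpu_offload()` replaces a manual `pipe.to("cuda")`; calling both defeats the purpose, since the whole pipeline would already be resident on the GPU.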
Thanks a lot for posting it! I'll wait to test until it can run on a modest GPU (12GB)... Best wishes, guys!
You can also test it using the Hugging Face online demo, available here.
Sadly there's too much traffic; the wait time is over 2 hours :(
Then use my template on Runpod.io: https://runpod.io/console/deploy?template=8pqb3dg7lq&ref=2pdhmpu1 Generation time is about 8 min per video.
Amazing work, guys, but why do most projects completely skip the hardware requirements needed to run them? Isn't that the most important information for a project meant to be run locally?