THUDM / CogVideo

Text-to-video generation: CogVideoX (2024) and CogVideo (ICLR 2023)
Apache License 2.0

Out of memory on a 16gb AMD 6900 XT with Cogvideox i2v. #315

Open GoDJr opened 2 days ago

GoDJr commented 2 days ago

I can't get CogVideoX I2V to work: I keep running out of memory on my 16 GB AMD card. But people are reporting it working on a 3060, and even on 6 GB cards. Not sure what I'm doing wrong; I'm already using a Q3-quantized T5 to try to cut down on memory usage. Also, is there a new invite link for the Discord? The one on the main GitHub is expired.

zRzRzRzRzRzRzR commented 2 days ago

The 3060 can run normally because NVIDIA devices support this feature:

```python
pipe.enable_sequential_cpu_offload()
```

As far as I know, AMD devices are not supported. The cause lies deeper, likely in PyTorch or in more core algorithms, and is not something we can intervene in. Without this optimization enabled, the pipeline would use about 26 GB of GPU memory instead of the current 5 GB.
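For readers unfamiliar with what `enable_sequential_cpu_offload()` buys you: the idea is that model weights live in CPU RAM and each submodule is moved to the GPU only for its own forward pass, so peak VRAM is roughly one submodule's weights rather than the whole model. The following is a toy sketch of that pattern in plain Python (the `Layer` class and `run_with_sequential_offload` helper are illustrative names, not part of diffusers or this repo):

```python
class Layer:
    """A stand-in for one submodule of a large model."""

    def __init__(self, name):
        self.name = name
        self.device = "cpu"  # weights start in CPU RAM

    def to(self, device):
        self.device = device
        return self

    def forward(self, x):
        # In a real framework, compute only works on the GPU copy.
        assert self.device == "gpu", "layer must be on the GPU to run"
        return x + 1


def run_with_sequential_offload(layers, x):
    """Run the model while keeping at most one layer on the GPU.

    Each layer is moved to the GPU just before its forward pass and
    moved back to the CPU right after, so peak GPU residency is a
    single layer's weights instead of the full model.
    """
    for layer in layers:
        layer.to("gpu")
        x = layer.forward(x)
        layer.to("cpu")
    return x


model = [Layer(f"block_{i}") for i in range(4)]
out = run_with_sequential_offload(model, 0)
```

The trade-off is speed: the constant host-to-device transfers make inference much slower, which is why diffusers exposes this as an opt-in call rather than the default.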