Doubiiu / ToonCrafter

[SIGGRAPH Asia 2024, Journal Track] ToonCrafter: Generative Cartoon Interpolation
https://doubiiu.github.io/projects/ToonCrafter/
Apache License 2.0

Multi GPU support #20

Open D3lik opened 5 months ago

D3lik commented 5 months ago

Hi. I have a desktop with 2x Tesla T4s, which should work since they have 32 GB of VRAM in total, while other people have reported around 27 GB of VRAM usage when inferring. However, during inference only one GPU is used, which causes a CUDA out-of-memory error. Which parts of the code should I edit so that it can run on multiple GPUs? Thanks in advance

supraxylon commented 5 months ago

What error messages are you getting? It seems a similar issue was discussed in another project: https://github.com/ultralytics/ultralytics/issues/1971

Try adjusting the batch size

D3lik commented 5 months ago

Adjusting the batch size doesn't work for me. I think I need to use DataParallel in PyTorch to make it work.
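A minimal sketch of what that could look like, assuming the model is a standard `nn.Module` (the `nn.Linear` here is a hypothetical stand-in for ToonCrafter's actual network, which is not shown in this thread):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the real model (not ToonCrafter's network).
model = nn.Linear(8, 4)

# Wrap the model so forward passes split the batch across all visible GPUs.
# On a box with 2x Tesla T4, device_ids would be [0, 1].
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model, device_ids=list(range(torch.cuda.device_count())))

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

x = torch.randn(6, 8, device=device)  # a batch of 6 is split across GPUs
out = model(x)
print(tuple(out.shape))  # (6, 4)
```

Note that `DataParallel` replicates the whole model on every GPU and only splits the batch dimension, so it would not let a single inference that needs ~27 GB fit across two 16 GB T4s; for that you would need some form of model parallelism (e.g., placing submodules on different devices), which this sketch does not cover.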

khawar-islam commented 5 months ago

@D3lik I am running on a single RTX 4090 but it gives a CUDA out-of-memory error. Do I need two GPUs?

D3lik commented 5 months ago

> @D3lik I am running on single RTX 4090 but it gives an error CUDA out of memory, do I need two gpus?

No. Please check issue #2 for solutions.