C0untFloyd / roop-unleashed

Evolved Fork of roop with Web Server and lots of additions
GNU Affero General Public License v3.0

Add feature multiple gpu #823

Closed. M4N0J-KUM4R closed this issue 3 days ago.

C0untFloyd commented 1 month ago

You mean selecting which GPU is being used or running multiple gpus at once in parallel?

zeeyannosse commented 1 month ago

Hello! Bravo for this exceptional tool! I was wondering the same question. I have 4 A5000 GPUs and it's only using one. Would it be possible to "activate" all of them in the preferences, like a render farm? Again, bravo and thank you! Yann

andro23001 commented 1 month ago

> You mean selecting which GPU is being used or running multiple gpus at once in parallel?

Kinda, but I would like an option like the backend process option in SwarmUI. In the Extras tab, if we have two GPUs we could queue both processes so they run on both GPUs, like an if/else statement: if the first process is running on CUDA execution provider 0, then run the second process on CUDA execution provider 1. I know that for now this is the only way to use multiple GPUs effectively.
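To illustrate the queueing idea above: a minimal sketch (not part of roop-unleashed) that alternates jobs between two GPUs by launching one worker process per device and setting CUDA_VISIBLE_DEVICES so each process only sees its assigned card. The job list and CLI flags are placeholders, not real roop options.

```python
import os
import subprocess
from itertools import cycle

# Hypothetical job list; each entry stands in for the arguments of one roop run.
jobs = [
    ["--source", "face1.jpg", "--target", "clip1.mp4"],
    ["--source", "face2.jpg", "--target", "clip2.mp4"],
]

# Alternate between two CUDA devices, if/else style: job 0 -> GPU 0, job 1 -> GPU 1, ...
gpu_ids = cycle([0, 1])

procs = []
for job_args, gpu_id in zip(jobs, gpu_ids):
    env = os.environ.copy()
    # The child process only sees its assigned GPU, so the CUDA execution
    # provider inside it treats that card as device 0.
    env["CUDA_VISIBLE_DEVICES"] = str(gpu_id)
    procs.append(subprocess.Popen(["python", "run.py", *job_args], env=env))

# Wait for both queued processes to finish.
for p in procs:
    p.wait()
```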

C0untFloyd commented 1 month ago

@andro23001 is probably right, it's currently very hard to do parallel processing in Python, and even more so with CUDA. I was hoping for something a lot simpler: just adding a setting for which CUDA device id to use, like 0, 1, 2 or 3. You could then start and run roop 4 times and no instance would interfere with another.
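For context, roop runs its models through onnxruntime, whose CUDA execution provider accepts a device_id option, so a per-instance setting could plausibly be wired through like this. A sketch under that assumption; the helper name and the way the setting is passed in are hypothetical.

```python
import onnxruntime as ort

def make_session(model_path: str, cuda_device_id: int = 0) -> ort.InferenceSession:
    """Create an inference session pinned to one CUDA device.

    cuda_device_id would come from the proposed setting (0, 1, 2 or 3),
    so each running roop instance sticks to its own GPU.
    """
    providers = [
        ("CUDAExecutionProvider", {"device_id": cuda_device_id}),
        "CPUExecutionProvider",  # fallback if CUDA is unavailable
    ]
    return ort.InferenceSession(model_path, providers=providers)

# Example: a second instance started with the setting set to 1
# session = make_session("models/inswapper_128.onnx", cuda_device_id=1)
```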

andro23001 commented 1 month ago

> @andro23001 is probably right, it's currently very hard to do parallel processing in Python, and even more so with CUDA. I was hoping for something a lot simpler: just adding a setting for which CUDA device id to use, like 0, 1, 2 or 3. You could then start and run roop 4 times and no instance would interfere with another.

I am new to programming, with moderate knowledge of Python. I'll also try to figure out a solution, but I have an idea for running on multiple GPUs using Docker layers, also called multi-stage builds: a CUDA layer, a Python layer on top of it, and the application layer on top of that. By the way, sorry for my poor English.
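One way the container idea above could pin instances to GPUs: with the NVIDIA Container Toolkit, docker run accepts a --gpus flag that restricts a container to a specific device, so a small script could start one container per GPU. A sketch under that assumption; the image name and port mapping are placeholders.

```python
import subprocess

# Hypothetical image built from a multi-stage Dockerfile
# (CUDA base layer -> Python layer -> roop-unleashed application layer).
IMAGE = "roop-unleashed:latest"

# One container per GPU; --gpus device=N (NVIDIA Container Toolkit)
# limits each container to a single card, and each gets its own port.
for gpu_id in range(4):
    subprocess.run([
        "docker", "run", "-d",
        "--gpus", f"device={gpu_id}",
        "-p", f"{7860 + gpu_id}:7860",  # assumed web UI port
        IMAGE,
    ], check=True)
```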

zeeyannosse commented 1 month ago

Thank you very much for your reply, this is promising.

github-actions[bot] commented 4 days ago

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

C0untFloyd commented 3 days ago

Please test https://github.com/C0untFloyd/roop-unleashed/discussions/880