Closed — M4N0J-KUM4R closed this issue 3 days ago
Hello! Bravo for this exceptional tool! I was wondering the same thing: I have 4 A5000 GPUs and it's only using one. Would it be possible to "activate" all of them in the preferences, like a render farm of sorts? Again, bravo, and thank you! Yann
You mean selecting which GPU is being used, or running multiple GPUs at once in parallel?
Kind of, but I would like an option like the backdoor process option in SwarmUI. In the extras tab, if we have two GPUs, we could queue both processes to run on both GPUs, like an if/else statement: if the first process is running on CUDA execution provider 0, then run the second process on CUDA execution provider 1. As far as I know, this is currently the only way to use multiple GPUs effectively.
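The "one process per GPU" idea above can be sketched without touching the application code at all, by pinning each launched process to its own device with `CUDA_VISIBLE_DEVICES`. This is a minimal illustration, not roop's actual CLI; the probe command below just prints which device each worker was handed:

```python
# Sketch: pin each worker process to one GPU via CUDA_VISIBLE_DEVICES.
# Inside a worker, the assigned GPU is renumbered as device 0, so workers
# never collide. The launched command here is a stand-in for a real job.
import os
import subprocess
import sys

def launch_on_gpu(device_id: int, args: list[str]) -> subprocess.Popen:
    """Start one Python worker that only sees GPU `device_id`."""
    env = os.environ.copy()
    env["CUDA_VISIBLE_DEVICES"] = str(device_id)
    return subprocess.Popen(
        [sys.executable, *args],
        env=env,
        stdout=subprocess.PIPE,
        text=True,
    )

if __name__ == "__main__":
    # Launch one worker per GPU; each reports the device it was assigned.
    probe = ["-c", "import os; print(os.environ['CUDA_VISIBLE_DEVICES'])"]
    workers = [launch_on_gpu(gpu, probe) for gpu in range(2)]
    for w in workers:
        out, _ = w.communicate()
        print("worker saw GPU:", out.strip())
```

Because the isolation happens in the environment, no if/else dispatch logic is needed inside the tool itself; a queue manager only has to pick a free device id before launching.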
@andro23001 is probably right; it's currently very hard to do parallel processing in Python, and even more so using CUDA. I was hoping for something a lot simpler: just adding a setting for which CUDA device id to use, like 0, 1, 2, or 3. You could then start and run roop four times and no instance would interfere with the others.
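Since roop runs its models through onnxruntime, a per-instance device-id setting like the one suggested could map directly onto the CUDAExecutionProvider's `device_id` option. A hedged sketch, assuming onnxruntime-gpu is installed and `model.onnx` is a placeholder path:

```python
# Sketch: build an onnxruntime provider list pinned to one CUDA device.
# "device_id" is a documented CUDAExecutionProvider option; everything
# else here (function name, model path) is illustrative.

def gpu_providers(device_id: int) -> list:
    """Provider list that pins an inference session to one CUDA device."""
    return [
        ("CUDAExecutionProvider", {"device_id": device_id}),
        "CPUExecutionProvider",  # fallback when CUDA is unavailable
    ]

# Usage (requires onnxruntime-gpu and a real model file):
#   import onnxruntime as ort
#   session = ort.InferenceSession("model.onnx", providers=gpu_providers(1))
```

Each roop instance would then only need to read its device id from a setting and pass it through when creating its sessions.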
I am new to programming, with moderate knowledge of Python, but I'll also try to figure out a solution. I have an idea for running on multiple GPUs using Docker with layers, also called multi-stage builds: a CUDA layer, a Python layer on top of it, and the application layer on top of that. By the way, sorry for my poor English.
Thank you very much for your reply, this is promising.
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.