coollofty opened this issue 1 year ago
AFAIK: Not a bug. It doesn't... At least, I haven't been able to get it to use multi-GPU, even with Accelerate.
Can anyone confirm?
Not a bug? Maybe....
But then why is memory not enough? Each card has 16GB of memory, and at least 10GB was still unused when the error occurred.
I think the only reasonable explanation is that some code does not respect the device-id setting and still uses the default card, so that card runs out of memory...
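A minimal sketch of why that would explain it (assuming the usual PyTorch behaviour, not quoting the webui's actual code): anything moved to the GPU with .cuda() or .to("cuda") without an explicit index lands on the current default device, which stays cuda:0 unless torch.cuda.set_device() is called, so any code path that ignores --device-id would allocate on card 0 from every instance at once.

```python
import torch

# Sketch only: shows how a missing device index falls back to cuda:0.
def load_weights(device_id: int):
    # Respects the requested card:
    explicit = torch.empty(1024, 1024, device=f"cuda:{device_id}")
    # Ignores it -- goes to the current default device (cuda:0 unless
    # torch.cuda.set_device(device_id) was called earlier):
    implicit = torch.empty(1024, 1024).cuda()
    return explicit, implicit

if torch.cuda.device_count() > 1:
    load_weights(1)
    for i in range(torch.cuda.device_count()):
        # Memory shows up on cuda:0 as well, even though only cuda:1 was asked for.
        print(f"cuda:{i}: {torch.cuda.memory_allocated(i)} bytes allocated")
```

If even one allocation in the generation path behaves like the implicit case above, every instance would pile extra memory onto the default card, which would match the out-of-memory errors on three of the four windows.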
Is there an existing issue for this?
What happened?
I have 4 A4000 16GB GPUs and use the openjourney model (about 2GB).
I tried setting both CUDA_VISIBLE_DEVICES and --device-id (in COMMANDLINE_ARGS, or appended to %PYTHON% launcher.py), and also CUDA_VISIBLE_DEVICES alone.
There is no problem during startup. When webui.bat prints "Running on local URL: http://0.0.0.0:7861", Task Manager shows that memory has been allocated on all four GPUs. Since the same model is used, the memory occupied on each GPU is the same: 3.4GB.
Then I opened 4 pages in my browser and visited 127.0.0.1:786[0-3] respectively; all 4 pages displayed normally. Then I typed a prompt and pressed the Generate button on all four pages as quickly as possible.
Steps to reproduce the problem
Open 4 command windows, and in each of them run:

1:
set CUDA_VISIBLE_DEVICES=0
set COMMANDLINE_ARGS=--device-id 0
webui.bat

2:
set CUDA_VISIBLE_DEVICES=1
set COMMANDLINE_ARGS=--device-id 1
webui.bat

3: ..........
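To check which card each instance actually ends up on, something like the following could be run in the same Python environment as one of the instances (a diagnostic sketch of my own, not part of the repro; it only uses standard PyTorch calls). Note that CUDA_VISIBLE_DEVICES re-indexes the visible cards starting from 0 inside each process.

```python
import os
import torch

# Diagnostic sketch: print what this process can see after
# CUDA_VISIBLE_DEVICES has been applied. With CUDA_VISIBLE_DEVICES=1
# the process sees a single GPU, re-indexed by PyTorch as cuda:0.
print("CUDA_VISIBLE_DEVICES =", os.environ.get("CUDA_VISIBLE_DEVICES"))
print("visible device count  =", torch.cuda.device_count())
for i in range(torch.cuda.device_count()):
    print(f"cuda:{i} ->", torch.cuda.get_device_name(i))
print("current default device: cuda:%d" % torch.cuda.current_device())
```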
What should have happened?
Only one instance operates normally; the other three report the same error, and which three command windows show the error is different each time.
Commit where the problem happens
master
What platforms do you use to access the UI ?
Windows
What browsers do you use to access the UI ?
Google Chrome
Command Line Arguments
List of extensions
DreamBooth, OpenPose, ControlNet
Console logs
Additional information
I just tried again: the behavior with only 2 GPUs is the same as with only 3 GPUs. Only one of them ever operates normally.