Closed: truedat101 closed this issue 6 months ago
Thanks for your interest. We will take your suggestions into consideration and may add multi-GPU support to the inference code in the future.
Oh, is it just this? `CUDA_VISIBLE_DEVICES=0`
Change `0` to `0,1,...`
Currently only the first GPU is used by default.
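For example (a minimal sketch, assuming the inference script uses PyTorch; the same variable can also be set on the command line, e.g. `CUDA_VISIBLE_DEVICES=0,1 python <your_inference_script>.py`, where the script name is a placeholder):

```python
import os

# CUDA_VISIBLE_DEVICES must be set before CUDA is initialised
# (in practice: before the first import of torch), otherwise it is ignored.
os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"

import torch

# The two exposed GPUs now appear to the process as cuda:0 and cuda:1.
print(torch.cuda.device_count())
```

Note that exposing more GPUs only reduces per-GPU memory pressure if the inference code actually places the model or its inputs on them; with single-GPU code everything still lands on the first visible device.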
The current setup defaults to GPU 0 and will run out of memory on a single GPU with 8 GB of VRAM or less. Ideally there would be a configuration option to specify which GPU to use, or how many GPUs to use. A sketch of what such an option could look like is below.
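A minimal sketch of such an option, assuming the inference script is PyTorch-based (the `--device` flag and `load_model` are hypothetical, not the repo's actual API):

```python
import argparse

import torch


def pick_device() -> torch.device:
    """Choose the inference device from a command-line flag."""
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--device",
        default="cuda:0",
        help="device to run inference on, e.g. cuda:0, cuda:1, or cpu",
    )
    args = parser.parse_args()
    if args.device.startswith("cuda") and not torch.cuda.is_available():
        print("CUDA not available, falling back to CPU")
        return torch.device("cpu")
    return torch.device(args.device)


if __name__ == "__main__":
    device = pick_device()
    print(f"Running inference on {device}")
    # model = load_model(...).to(device)   # hypothetical loader; move the model once
    # outputs = model(batch.to(device))    # move each input batch to the same device
```

This would at least let users with a small GPU 0 point inference at a larger card; splitting one model across several GPUs is a separate change.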