deep-floyd / IF


Can I distribute the stages over multiple GPUs? Like you see below #50

Open phalexo opened 1 year ago

phalexo commented 1 year ago

```python
from deepfloyd_if.modules import IFStageI, IFStageII, StableStageIII
from deepfloyd_if.modules.t5 import T5Embedder
```

I have 4 Maxwell Titan X cards with 12GB VRAM each.

```python
if_I = IFStageI('IF-I-XL-v1.0', device='cuda:0')
if_II = IFStageII('IF-II-L-v1.0', device='cuda:1')
if_III = StableStageIII('stable-diffusion-x4-upscaler', device='cuda:2')
t5 = T5Embedder(device="cuda:3")
```
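For context, generation with this split would then go through the `dream` pipeline as in the repo README. A minimal sketch, assuming the README's example prompt and sampling kwargs; it is untested on this particular 4-GPU layout:

```python
from deepfloyd_if.pipelines import dream

# Illustrative prompt and kwargs mirroring the README example;
# t5, if_I, if_II, if_III are the stage objects created above,
# each pinned to its own GPU.
result = dream(
    t5=t5, if_I=if_I, if_II=if_II, if_III=if_III,
    prompt=['ultra close-up color photo portrait of an owl in the woods'],
    seed=42,
    if_I_kwargs={'guidance_scale': 7.0, 'sample_timestep_respacing': 'smart100'},
    if_II_kwargs={'guidance_scale': 4.0, 'sample_timestep_respacing': 'smart50'},
    if_III_kwargs={'guidance_scale': 9.0, 'noise_level': 20, 'sample_timestep_respacing': '75'},
)

if_III.show(result['III'], size=14)  # display the stage III (upscaled) outputs
```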

brycedrennan commented 1 year ago

did you try it?

phalexo commented 1 year ago

No, to try it I have to provide my info to download the checkpoints. If the above is not possible with the current code, then I don't want to register for the checkpoints.

ThioJoe commented 1 year ago

Seems to work.

Before:

[screenshots]

After:

[screenshots]

phalexo commented 1 year ago

> Seems to work.

Can I assume you were also able to get an image out of it? Since they are already using the "accelerate" library, I had hoped it would work.

Thanks for the info.

ThioJoe commented 1 year ago

Yes, I can now confirm I was able to get images out of it with multiple GPUs.

Also, you can change `t5 = T5Embedder(device="cpu")` to a GPU, e.g. `t5 = T5Embedder(device="cuda:0")`.

phalexo commented 1 year ago

> Yes, I can now confirm I was able to get images out of it with multiple GPUs.
>
> Also, you can change `t5 = T5Embedder(device="cpu")` to a GPU, e.g. `t5 = T5Embedder(device="cuda:0")`.

Did you do anything additional? I am running into an error that cuBLAS is not supported when t5 is called. I installed CUDA toolkit 11.8 after 11.3, and it still gives me the same error.

Did the images display correctly?

Can you put up some info about your environment? Python module versions, kernel, etc.
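For anyone comparing setups, a quick way to dump the relevant versions is below; it is a minimal sketch using only standard PyTorch calls, nothing specific to deep-floyd/IF:

```python
import torch

# Report the CUDA toolchain torch was built against and the visible GPUs,
# which is usually enough to track down cuBLAS / toolkit mismatches.
print("torch:", torch.__version__)
print("built with CUDA:", torch.version.cuda)
print("cuDNN:", torch.backends.cudnn.version())
for i in range(torch.cuda.device_count()):
    p = torch.cuda.get_device_properties(i)
    print(f"cuda:{i}: {p.name}, {p.total_memory / 1024**3:.1f} GiB, "
          f"compute capability {p.major}.{p.minor}")
```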

Thanks.

larme commented 1 year ago

Hi all, we made an integration of BentoML with IF. It supports starting a local server with a simple way to assign stages to any GPU available on the system. You can also build a Docker image and deploy it to a production environment using bentoctl or BentoCloud.