kukrishna / genaudit

Apache License 2.0

Multiple GPUs for inference #1

Open INF800 opened 5 months ago

INF800 commented 5 months ago

Is there any out-of-the-box way to run FactChecker("hf:kundank/genaudit-usb-flanul2") inference with multiple GPUs? My 24 GB GPU goes OOM.

INF800 commented 5 months ago

Can we do distributed inference with multiple GPUs, i.e., loading parts of the model onto each GPU?

kukrishna commented 5 months ago

Yes, it is possible to split the model across more than one GPU if it doesn't fit in one. The current version of the library does not support it, but I will add it when I have more bandwidth.
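A possible interim workaround, assuming genaudit loads the checkpoint through Hugging Face Transformers: pass `device_map="auto"` (which requires the `accelerate` package) when calling `from_pretrained`, so the layers are sharded across all visible GPUs. This is a hedged sketch, not the genaudit API; the `round_robin_device_map` helper below is hypothetical and just illustrates how a manual layer-to-GPU device map can be built when the automatic one is unsatisfactory.

```python
def round_robin_device_map(num_layers: int, num_gpus: int) -> dict:
    """Hypothetical helper: assign layers to GPUs in contiguous chunks.

    Returns a mapping like {"layer.0": 0, ..., "layer.7": 1} suitable
    as a starting point for a manual Transformers device_map.
    """
    per_gpu = -(-num_layers // num_gpus)  # ceiling division
    return {f"layer.{i}": i // per_gpu for i in range(num_layers)}


if __name__ == "__main__":
    # Sketch of sharded loading (downloads a large checkpoint; needs
    # `transformers`, `accelerate`, and 2+ GPUs with enough total VRAM).
    # Model id taken from the thread; dtype/device choices are assumptions.
    from transformers import AutoModelForSeq2SeqLM

    model = AutoModelForSeq2SeqLM.from_pretrained(
        "kundank/genaudit-usb-flanul2",
        device_map="auto",     # shard layers across all visible GPUs
        torch_dtype="auto",    # use the checkpoint's native precision
    )
    print(model.hf_device_map)  # inspect which layer landed on which GPU
```

Whether this helps depends on how FactChecker constructs its model internally; until the library exposes a knob for it, patching the `from_pretrained` call is the most direct route.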