bigscience-workshop / petals

🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
https://petals.dev
MIT License
8.89k stars 489 forks

Support stable diffusion model #519

Closed lbgws2 closed 9 months ago

lbgws2 commented 9 months ago

Can I use a Stable Diffusion model with Petals?

borzunov commented 9 months ago

Hi @lbgws2,

StableDiffusion usually fits into one GPU, so we recommend using StableHorde instead (it's another volunteer computing project, focused on smaller models).

We don't plan to support StableDiffusion, since Petals is focused on very large models that don't fit into a single consumer GPU (which is why we have to use pipeline parallelism).
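To make the distinction concrete, here is a minimal sketch (not Petals code, and not a real model) of the pipeline-parallelism idea the answer refers to: a model too large for one device is split into contiguous stages, each held by a different server, and activations flow from stage to stage. All names and the scalar "layers" below are illustrative assumptions.

```python
def make_layer(weight):
    # A toy "layer": just multiplication by a scalar weight.
    return lambda x: x * weight

def split_into_stages(layers, num_stages):
    """Assign contiguous blocks of layers to each stage (server)."""
    stages = [[] for _ in range(num_stages)]
    for i, layer in enumerate(layers):
        stages[i * num_stages // len(layers)].append(layer)
    return stages

def pipeline_forward(stages, x):
    """Run activations through each stage in order, one stage at a time."""
    for stage in stages:
        for layer in stage:
            x = layer(x)
    return x

layers = [make_layer(w) for w in (2, 3, 5, 7)]  # 4 toy "transformer blocks"
stages = split_into_stages(layers, 2)           # split across 2 servers
result = pipeline_forward(stages, 1)            # 1 * 2 * 3 * 5 * 7 = 210
```

A single-GPU model like StableDiffusion never needs this split, which is why a pool of independent full-model workers (the StableHorde approach) fits it better.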

lbgws2 commented 9 months ago

What a surprise!! Thank you very much, @borzunov