exo-explore / exo

Run your own AI cluster at home with everyday devices 📱💻 🖥️⌚
GNU General Public License v3.0

[BOUNTY - $500] Distributed stable diffusion #159

Open · AlexCheema opened 3 weeks ago

AlexCheema commented 3 weeks ago

Will require some core changes to how distributed inference works, hence higher bounty of $500. This would be a great contribution to exo.

pranav4501 commented 3 weeks ago

Hi @AlexCheema, can I take this up?

AlexCheema commented 3 weeks ago

> Hi @AlexCheema, can I take this up?

Yes please! Added you to the sheet @pranav4501

AlexCheema commented 3 weeks ago

Btw you mentioned you only have one device to test. You can still run multiple nodes on a single device. The easiest way is to do something like

```
python3 main.py --listen-port 5678 --broadcast-port 5679 --chatgpt-api-port 8000 --node-id "node1"
python3 main.py --listen-port 5679 --broadcast-port 5678 --chatgpt-api-port 8001 --node-id "node2"
```

This is a trick to make sure ports don’t conflict and the nodes can still discover each other.
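To see why swapping the ports works, here is a minimal standalone sketch (plain UDP on localhost, not exo's actual discovery code): each node binds its own listen port and announces itself on the *other* node's listen port, so two nodes on one machine never collide and each still hears the other.

```python
# Hypothetical illustration of the port-swap trick: two "nodes" on one
# machine, each listening on its own port and announcing on the peer's.
import socket
import threading

def udp_socket(port):
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind(("127.0.0.1", port))
    s.settimeout(2.0)
    return s

# Bind both sockets up front so no announcement is lost.
node1 = udp_socket(5678)
node2 = udp_socket(5679)

heard = {}

def listen(name, sock):
    # Record the first announcement this node receives.
    try:
        data, _ = sock.recvfrom(1024)
        heard[name] = data.decode()
    except socket.timeout:
        heard[name] = None

threads = [threading.Thread(target=listen, args=(n, s))
           for n, s in [("node1", node1), ("node2", node2)]]
for t in threads:
    t.start()

# Each node announces itself on the OTHER node's listen port.
node1.sendto(b"hello from node1", ("127.0.0.1", 5679))
node2.sendto(b"hello from node2", ("127.0.0.1", 5678))

for t in threads:
    t.join()
print(heard)  # each node has discovered the other
```

Running it, each entry in `heard` contains the peer's announcement, which is exactly the symmetry the swapped `--listen-port`/`--broadcast-port` flags set up.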

You can also write tests, which should be a faster way of iterating.

pranav4501 commented 3 weeks ago

Yes, this works. Thank you Alex.

pranav4501 commented 3 weeks ago

Hi @AlexCheema,

  • [ ] Stable diffusion (text to image) => Stable Diffusion v2-1
  • [ ] MLX distributed inference
  • [ ] Tinygrad distributed inference

This is my understanding of the requirements and the model I am using. Please let me know of any changes to this.

AlexCheema commented 3 weeks ago

> Hi @AlexCheema,
>
> - [ ] Stable diffusion (text to image) => Stable Diffusion v2-1
> - [ ] MLX distributed inference
> - [ ] Tinygrad distributed inference
>
> This is my understanding of the requirements and the model I am using. Please let me know of any changes to this.

Looks good to me.

There are already examples for MLX and Tinygrad for Stable Diffusion v2:

- Example inference code for MLX: https://github.com/ml-explore/mlx-examples/tree/main/stable_diffusion
- Example inference code for Tinygrad: https://github.com/tinygrad/tinygrad/blob/master/examples/sdv2.py
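For intuition on the "core changes to how distributed inference works", here is a toy sketch (not exo's actual API; all names are made up) of the pipeline-parallel pattern a distributed diffusion model would need: the model's layers are split into shards, one shard per node, and each shard's activations are handed to the next node — repeated once per denoising step.

```python
# Toy pipeline-parallel sketch: a chain of "layers" split across two
# hypothetical nodes. In a real system the hop between shards would
# cross the network, once per diffusion denoising step.
from dataclasses import dataclass
from typing import Callable, List

Layer = Callable[[float], float]

@dataclass
class Shard:
    node_id: str
    layers: List[Layer]

    def forward(self, x: float) -> float:
        # Run this node's slice of the model locally.
        for layer in self.layers:
            x = layer(x)
        return x

# A pretend 4-layer "model", split across two nodes.
layers: List[Layer] = [
    lambda x: x + 1,
    lambda x: x * 2,
    lambda x: x - 3,
    lambda x: x * x,
]
shards = [Shard("node1", layers[:2]), Shard("node2", layers[2:])]

def run_pipeline(shards: List[Shard], x: float) -> float:
    # Each shard's output becomes the next shard's input.
    for shard in shards:
        x = shard.forward(x)
    return x

print(run_pipeline(shards, 1.0))  # ((1 + 1) * 2 - 3) ** 2 = 1.0
```

Diffusion makes this harder than one-shot LLM decoding because the full pipeline must be traversed for every denoising step, so the shard hand-off is on the hot path.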