comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, API, and backend with a graph/nodes interface.
https://www.comfy.org/
GNU General Public License v3.0

Support For TPU/XLA Devices #5635

Open radna0 opened 6 days ago

radna0 commented 6 days ago

Feature Idea

Support For TPU/XLA Devices

Existing Solutions

No response

Other

No response

gabriel-montrose commented 5 days ago

It's a good idea. Right now we have to do a lot of manual work: install Intel oneAPI, install Visual Studio on Windows, set the correct PyTorch version... Exhausting. I'm also not sure whether it currently uses the shared GPU on my system, where I have both Intel UHD and Nvidia.

radna0 commented 5 days ago

I'm experimenting to see if I can add support for TPU/XLA devices within the comfy code myself. If possible, I can try to open a PR to add support for it.

I don't know if there are configurations that combine TPUs with GPUs. As for sharing work across TPU cores, the most common approaches I've seen are SPMD or FSDPv2. I believe you can also do it with xmp (torch_xla's multiprocessing module), but I haven't had much luck with that on TPU v2-8 or v3-8.

Torch XLA: https://github.com/pytorch/xla
Guide: https://github.com/pytorch/xla/blob/master/API_GUIDE.md
SPMD: https://pytorch.org/xla/master/spmd.html
FSDPv2: https://github.com/pytorch/xla/issues/6379
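To give a rough idea of what "support" would involve: torch_xla exposes TPUs to PyTorch as `xla` devices via `torch_xla.core.xla_model.xla_device()`, so one entry point would be teaching the device-selection logic to prefer an XLA device when the library is present. The sketch below is a hypothetical, standalone helper (not ComfyUI's actual `model_management` code) that falls back gracefully when torch_xla or CUDA is unavailable:

```python
# Hedged sketch: prefer a TPU/XLA device if torch_xla is installed,
# otherwise fall back to CUDA, then CPU. The torch_xla import path and
# xla_device() call are the real torch_xla API; the helper itself and
# its name are illustrative, not ComfyUI code.

def get_torch_device_name():
    """Return a device name string, preferring TPU/XLA when available."""
    try:
        import torch_xla.core.xla_model as xm
        return str(xm.xla_device())  # e.g. "xla:0" on a TPU VM
    except ImportError:
        pass  # torch_xla not installed; try CUDA next
    try:
        import torch
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass  # torch itself not installed
    return "cpu"
```

In practice a real PR would also need to replace CUDA-specific calls (memory queries, autocast, `.to()` targets) with XLA equivalents and add `xm.mark_step()` at the right points, since XLA executes lazily.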

radna0 commented 4 days ago

Hi @gabriel-montrose, I have just created a PR: https://github.com/comfyanonymous/ComfyUI/pull/5657