radna0 opened 6 days ago
It's a good idea. Right now we have to do manual work: install Intel oneAPI, install Visual Studio on Windows, set the correct PyTorch version... Exhausting. I'm also not sure whether it currently uses the shared GPU on my machine, where I have both Intel UHD and Nvidia.
I'm experimenting to see if I can add support for TPU/XLA devices in the ComfyUI code myself. If it works, I can try to open a PR to add support for it.
I don't know if there are configurations that pair TPUs with GPUs. As for sharing work across TPU cores, the most common approaches I've seen are SPMD or FSDPv2. I believe you can also do it with XMP? But I haven't had much luck with that on TPUv2-8 or TPUv3-8.
Torch XLA: https://github.com/pytorch/xla
Guide: https://github.com/pytorch/xla/blob/master/API_GUIDE.md
SPMD: https://pytorch.org/xla/master/spmd.html
FSDPv2: https://github.com/pytorch/xla/issues/6379
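For context, here's a minimal sketch of how device selection could probe for an XLA (TPU) device and fall back to CPU when `torch_xla` isn't usable. This is illustrative only, not ComfyUI's actual device-selection code; the one real API assumed is `torch_xla.core.xla_model.xla_device()`, the standard torch_xla entry point for getting the current XLA device:

```python
def pick_device() -> str:
    """Return an XLA (TPU) device string if torch_xla is usable, else "cpu".

    Hypothetical helper for illustration; ComfyUI's real device selection
    is more involved and also handles CUDA, MPS, etc.
    """
    try:
        # torch_xla exposes TPU/XLA devices via xla_model.xla_device()
        import torch_xla.core.xla_model as xm
        return str(xm.xla_device())  # e.g. "xla:0" on a TPU host
    except Exception:
        # torch_xla not installed, or no XLA runtime available
        return "cpu"


if __name__ == "__main__":
    print(pick_device())
```

A try/except around the import keeps the rest of the code path working on machines without TPUs, which is roughly the pattern a PR adding an optional backend would need.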
hi @gabriel-montrose, I have just created a PR. https://github.com/comfyanonymous/ComfyUI/pull/5657
Feature Idea
Support For TPU/XLA Devices
Existing Solutions
No response
Other
No response