AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI
GNU Affero General Public License v3.0

Support for TPU #3032

Open Ljzd-PRO opened 1 year ago

Ljzd-PRO commented 1 year ago

Can it support Google TPUs (like on Google Colab)?

ClashSAN commented 1 year ago

What do you mean? TPUs favor TensorFlow; everything here is PyTorch.

giteeeeee commented 1 year ago

I'm curious as well. If you get a TPU on Colab, is it going to be slower than an RTX card of the same tier?

Ljzd-PRO commented 1 year ago

> I'm curious as well. If you get a TPU on Colab, is it going to be slower than an RTX card of the same tier?

It’s said that TPUs are faster for inference, but not for training.

giteeeeee commented 1 year ago

Just tested on a local RTX 2060 6G vs. a Colab T4 12G.

The 2060 appears to be ~25% faster when doing text2image and image2image.

Training I can't test with a 2060 lol

Extraltodeus commented 1 year ago

TPUs seem to be good at generating images in parallel. It would be very nice to have such compatibility.

iamianM commented 1 year ago

Is there any development on this?

swcrazyfan commented 1 year ago

Diffusers officially supports TPU, so I'm guessing adding it wouldn't be a complete overhaul. However, since it's Flax, I'm not sure exactly how it would be done.

RarogCmex commented 1 year ago

There is a project, https://github.com/magicknight/stable-diffusion-tpu, but it seems a bit abandoned.

upright2003 commented 1 year ago

I searched for some information; it seems you need to modify launch.py and webui.py:

https://blog.richliu.com/2023/03/04/5109/stable-diffusion-webui-cpu-only-on-arm64-platform/
https://huggingface.co/docs/diffusers/using-diffusers/stable_diffusion_jax_how_to
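For illustration, the kind of change those links point at might look like the device-selection helper below. This is a hypothetical sketch (the name `pick_device` is made up, not webui's actual API) that prefers a TPU via torch_xla when present and degrades gracefully otherwise:

```python
import torch

def pick_device() -> torch.device:
    """Hypothetical device-selection helper for launch.py/webui.py:
    prefer a TPU via torch_xla when available, then CUDA, then CPU."""
    try:
        import torch_xla.core.xla_model as xm  # only present on TPU VMs
        return xm.xla_device()
    except ImportError:
        pass
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = pick_device()
print(device)
```

On a machine without torch_xla or CUDA this simply returns the CPU device, so the rest of the pipeline can stay device-agnostic.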

ClashSAN commented 1 year ago

Omg, webui on Android? (edit: nvm) Wonder if TPU inference would work on the Tensor chip in the Pixel 6...

vsemecky commented 1 year ago

> TPUs favor TensorFlow, everything here is PyTorch.

That's a misunderstanding. The "T" in TPU stands for "Tensor", not "TensorFlow". Both PyTorch and TensorFlow can use TPUs under the hood. Look at https://colab.research.google.com/github/pytorch/xla/blob/master/contrib/colab/getting-started.ipynb

Besides the original SD, there is also the Diffusers edition, which can work on TPU: https://huggingface.co/blog/stable_diffusion_jax

aeroxy commented 1 year ago

I can get it to run on a TPU VM, but it's very slow.

aeroxy commented 1 year ago

> Can it support Google TPUs (like on Google Colab)?

I looked into the source code; it looks like it would take a massive effort to support TPUs. First we'd need custom versions of torch, torch_xla, and torchvision, and then we'd need to modify Stable Diffusion itself wherever it calls torch APIs. TPUs currently don't support all the APIs used in Stable Diffusion, meaning we'd need to debug each single API.

NXTler commented 9 months ago

How would we even handle memory? The Coral TPUs don't have any to begin with. Still, it would be really cool if there were support.

B0rner commented 9 months ago

> I can get it to run on TPU VM but it's very slow.

Can you share the code for how you got this running? Was it slow because of the low performance of the TPU, or because the TPU wasn't used and the script ran on the CPU?

> i looked into the source code it looks like it would take a massive effort to support TPU. First we need custom versions of torch, torch_xla, torchvision, and then we need to modify stable diffusion itself when calling torch APIs. TPU currently do not support all the APIs used in stable diffusion meaning we need to debug each single API.

TPUs are currently the only way to give usable access to tools like automatic1111 to users who are unable to upgrade their GPU. This applies, for example, to all laptops that do not have a dedicated GPU. TPU support would significantly increase the userbase.

Isn't it possible to use something like this with automatic1111: https://huggingface.co/blog/stable_diffusion_jax ?
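The core trick in that JAX post is `jax.pmap` data parallelism: one image shard per TPU core. A toy sketch of the pattern, with a stand-in function instead of the real denoising step (on a plain CPU, `local_device_count()` is just 1, so the same code still runs):

```python
import jax
import numpy as np

n = jax.local_device_count()  # 8 on a TPU v3-8, 1 on a plain CPU

# Stand-in for one pipeline step; pmap runs one shard per device in parallel.
@jax.pmap
def step(latents):
    return latents * 0.5 + 1.0

# Shard the batch so the leading axis matches the device count.
batch = np.ones((n, 4, 8, 8), dtype=np.float32)
out = step(batch)
print(out.shape)
```

The Flax Stable Diffusion pipeline in the blog post replicates the model parameters across cores and shards the prompts the same way; this snippet only shows the sharding mechanics, not the real model.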

ehamawy commented 2 months ago

It's definitely possible. Here's an example of someone getting SDXL running on Google's TPU v5e: https://huggingface.co/blog/sdxl_jax