-
Hi!
First of all, thank you.
I see you are supporting fp16 too. My question is: will you convert the main model to fp16 as well, for a smaller size?
Does it support CPU?
And could you please make it compatible w…
-
This would work on Mac/Windows/Linux and would be easier to test.
— jtoy, updated 11 months ago
-
Screenshot of the error:
![image](https://github.com/rupeshs/fastsdcpu/assets/131554379/b21895bd-9fee-4b6f-89dc-36fd2a8eb94f)
How can I fix it?
-
First of all, batch generation would be very cool to have.
Is there a way you could implement some form of upscaling here? Nothing too complicated, just basic 2x or 4x upscaling using any mod…
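For what it's worth, a basic integer-factor upscale (with no model at all) is only a few lines of NumPy; `upscale_nearest` is a hypothetical helper name here, and a model-based upscaler such as Real-ESRGAN would be a separate, larger feature:

```python
import numpy as np

def upscale_nearest(image: np.ndarray, factor: int = 2) -> np.ndarray:
    """Upscale an HxWxC image by an integer factor using nearest-neighbor:
    each pixel is repeated `factor` times along both spatial axes."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

# Tiny 2x2 RGB image as a stand-in for a generated output.
img = np.zeros((2, 2, 3), dtype=np.uint8)
big = upscale_nearest(img, 2)
print(big.shape)  # (4, 4, 3)
```

Nearest-neighbor keeps the code dependency-free but looks blocky; bilinear or Lanczos resampling (e.g. via Pillow's `Image.resize`) would look smoother at the same cost.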
-
Hi!
Can someone please tell me how I can convert fp32 diffusers models (the ones with unet, vae, tokenizer, etc. folders used for Stable Diffusion) to fp16?
I can't find any tutorial or…
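In case it helps: with the `diffusers` library, the usual route is to load the pipeline with `torch_dtype=torch.float16` via `DiffusionPipeline.from_pretrained(...)` and then call `save_pretrained(...)` on it (I haven't verified this against every model layout). The storage effect itself is just a dtype cast; this small NumPy sketch shows why the files end up roughly half the size:

```python
import numpy as np

# Stand-in for one fp32 weight tensor from a diffusers model folder.
w32 = np.random.rand(1024, 1024).astype(np.float32)

# Casting to fp16 halves the storage per element (4 bytes -> 2 bytes),
# at the cost of reduced numeric precision.
w16 = w32.astype(np.float16)

print(w32.nbytes, w16.nbytes)  # fp16 uses exactly half the bytes
```

The same halving applies to every weight tensor in the unet/vae/text-encoder folders, which is where the size saving comes from.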
-
Hi,
I have a low-core CPU but a big GPU. How do I use this with the GPU only? My GPU is an RTX 4090.
Thanks in advance.
-
Could you make a Google Colab link?
-
Is it able to run on the CPU? It would be nice if it could be optimized to run on the CPU. I think there are Stable Diffusion implementations that can run on the CPU, and if this could too, that would be great.
-
anything-v5 + latent-consistency/lcm-lora-sdxl:
```
***** Init LCM-LoRA pipeline - stablediffusionapi/anything-v5 *****
Loading pipeline components...: 14%|████████████████████████▍ …
```
-
Hello!
Thank you for this!
I'm a huge fan of ONNX as it is very CPU-friendly. Could you please try to make AnimateDiff work with ONNX too? It would be great to have text-to-GIF and image-to-GIF with An…