-
Is it possible to run a distilled model (Block-removed Knowledge-distilled Stable Diffusion, https://github.com/Nota-NetsPresso/BK-SDM)?
When I tried some distilled models, I got an error message simi…
-
According to my understanding, these checkpoints are distilled SD models that generate with Dynamic, is this right? Moreover, could you provide more training details for these models? Thanks a lot in…
-
Thank you for open-sourcing the code. But I still have a few questions, as follows.
1. When I checked the saved distilled images, I found that you combined them into vis.pdf, a CSS sprite encompassing a …
-
I am so curious what a video will look like when distilled.
-
https://github.com/ggerganov/whisper.cpp/pull/1424
It'd be nice to give this a shot.
-
### System Info
This is with Transformers.js v2
### Environment/Platform
- [X] Website/web-app
- [ ] Browser extension
- [ ] Server-side (e.g., Node.js, Deno, Bun)
- [ ] Desktop app (e.g., Electro…
-
ValueError: Cannot load from /ai/dit_dis/transformer because the following keys are missing:
time_extra_emb.style_embedder.weight.
Please make sure to pass `low_cpu_mem_usage=False` and `devi…
-
### Have you checked that your issue isn't already filed?
- [X] I read through [FAQ](https://github.com/alshedivat/al-folio/blob/master/FAQ.md) and searched through the [past issues](https://github…
-
Sorry to bother you, but I see the distill_loss in distill.py defined as:
distill_loss = F.kl_div(
F.log_softmax(distill_logits / T, dim=-1),
F.softmax(teacher_logits / T, dim=…
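For reference, here is a minimal pure-Python sketch of the quantity that call computes: `F.kl_div(F.log_softmax(student/T), F.softmax(teacher/T))` is the KL divergence KL(teacher ‖ student) between the temperature-softened distributions, commonly scaled by T² to keep gradient magnitudes comparable across temperatures (Hinton-style distillation). The function and variable names below are illustrative, not taken from distill.py, and this sketch computes a summed (not batch-averaged) KL over a single logit vector.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over a flat list of logits,
    # with the usual max-subtraction for numerical stability.
    scaled = [z / T for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distill_kl(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on softened distributions, scaled by T*T.
    # Equivalent in spirit to:
    #   F.kl_div(F.log_softmax(student/T), F.softmax(teacher/T)) * T*T
    p = softmax(teacher_logits, T)  # teacher probabilities (target)
    q = softmax(student_logits, T)  # student probabilities (input)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * T * T
```

The loss is zero when the student exactly matches the teacher's logits and strictly positive otherwise, which is a quick sanity check when debugging a distillation run.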