segmind/distill-sd: Segmind Distilled diffusion
Discord: https://discord.gg/p2MdJqZXnb
550 stars · 36 forks
Issues (newest first)
#20 · [rank0]: AssertionError: You can't use same `Accelerator()` instance with multiple models when using DeepSpeed · foreverpiano · opened 5 days ago · 1 comment
#19 · poor result · leya516 · opened 9 months ago · 0 comments
#18 · How to infer using the trained model? · littletomatodonkey · closed 10 months ago · 1 comment
#17 · Can I use refine and lora? · pendekarcode · opened 10 months ago · 0 comments
#16 · fix typo · Shubhamkashyap1601 · closed 8 months ago · 0 comments
#15 · How much gpu ram needed to train and inference? · universewill · opened 11 months ago · 0 comments
#14 · Fixes AttributeError accessing modules in unet: "unet.<up, mid, down>_blocks" · shreyas269 · closed 1 year ago · 0 comments
#13 · Fix typo in "distill_training.py" · shreyas269 · closed 1 year ago · 0 comments
#12 · Getting broadcast error when trying to run distill_training.py · shreyas269 · closed 1 year ago · 2 comments
#11 · Encountered a very confusing issue while resuming from checkpoint. · DragonDRLI · opened 1 year ago · 4 comments
#10 · distill_training error · youngwanLEE · closed 1 year ago · 4 comments
#9 · multi-gpu training · youngwanLEE · closed 1 year ago · 0 comments
#8 · Discord link is expired · jjohare · closed 1 year ago · 1 comment
#7 · ControlNet? · oxysoft · opened 1 year ago · 2 comments
#6 · Update README.md · eltociear · opened 1 year ago · 0 comments
#5 · distillation on img2img or inpainting model · jinwonkim93 · closed 1 year ago · 2 comments
#4 · Running this in automatic1111? · johari3275 · opened 1 year ago · 1 comment
#3 · Hi, any plan on stablediffusion XL version? · lucasjinreal · opened 1 year ago · 7 comments
#2 · Thank You, and Typo Correction Request for Hugging Face Blog · bokyeong1015 · opened 1 year ago · 1 comment
#1 · Can this be used for SDXL? · ninjasaid2k · closed 1 year ago · 0 comments