kijai / ComfyUI-KwaiKolorsWrapper
Diffusers wrapper to run Kwai-Kolors model
Apache License 2.0 · 400 stars · 16 forks

Issues
#29 · Building details are all blurry · slash130 opened 1 hour ago · 1 comment
#28 · Support for ComfyUI's extra_model_paths.yaml · petercham opened 8 hours ago · 0 comments
#27 · This problem occurs when running ComfyUI-KwaiKolorsWrapper; how should I solve it? · elisha7366 opened 4 days ago · 0 comments
#26 · Error, the picture is black · huameiwei-vc opened 6 days ago · 0 comments
#25 · ValueError: Cannot load <class 'diffusers.models.unets.unet_2d_condition.UNet2DConditionModel'> · caoshouling opened 1 week ago · 0 comments
#24 · Can fp32 not be used? · xiaoyin199 opened 1 week ago · 1 comment
#23 · Trying to make it work with SwarmUI · Michoko92 opened 1 week ago · 0 comments
#22 · Library cublasLt is not initialized · c4dee closed 1 week ago · 1 comment
#21 · ChatGLM exception during processing · LankyPoet closed 1 week ago · 2 comments
#20 · Error occurred when executing KolorsTextEncode: Torch not compiled with CUDA enabled · foggyghost0 closed 1 week ago · 2 comments
#19 · Is this node still usable? Hasn't Kuaishou deleted diffusion_pytorch_model.fp16.bin? · hotdogarea opened 1 week ago · 0 comments
#18 · [Bug] After adding this plugin, Yoloworld_ESAM_Zho began to generate errors: failed in the TorchScript interpreter. RuntimeError: invalid vector subscript · Erwin11 opened 1 week ago · 2 comments
#17 · Is the effect the same between the two ways of loading the chatglm3 model? · alenhrp opened 1 week ago · 2 comments
#16 · Error occurred when executing LoadChatGLM3 · HEITAOKAKA opened 1 week ago · 1 comment
#15 · Could ChatGLM3 run on CPU only? · snow2zhou opened 1 week ago · 2 comments
#14 · Error occurred when executing DownloadAndLoadKolorsModel · lesteryebin opened 1 week ago · 2 comments
#13 · Error occurred when executing KolorsTextEncode: "softmax_lastdim_kernel_impl" not implemented for 'Half' · roninDday opened 1 week ago · 0 comments
#12 · On Ubuntu Linux: error occurred when executing DownloadAndLoadKolorsModel · SouthbayJay opened 1 week ago · 0 comments
#11 · Error occurred when executing DownloadAndLoadKolorsModel · wtl196544579 opened 1 week ago · 1 comment
#10 · I can run Kolors on a quant4 setup, but loading the chatglm3-4bit.safetensors model separately gives an error · klossm opened 1 week ago · 0 comments
#9 · macOS issue with the Load ChatGLM3 Model node · jwooldridge234 closed 1 week ago · 3 comments
#8 · Multiple files missing · DavidSnow1 opened 1 week ago · 34 comments
#7 · No support yet for LoRA, ControlNet, or IPAdapter? · ioritree opened 1 week ago · 1 comment
#6 · Why does "Loading checkpoint shards:" take so long? · xueqing0622 closed 1 week ago · 5 comments
#5 · Error occurred when executing DownloadAndLoadKolorsModel · wangxq1997 opened 1 week ago · 1 comment
#4 · The loaded bin model has been removed, please update · alex13by closed 1 week ago · 0 comments
#3 · Add pyproject.toml for Custom Node Registry · haohaocreates closed 1 week ago · 0 comments
#2 · Add GitHub Action for publishing to Comfy Registry · haohaocreates closed 1 week ago · 0 comments
#1 · Newest update error · BNP1111 closed 1 week ago · 3 comments