zpengcom opened 4 months ago
Have you downloaded all the necessary files?
ultrapixel_t2i.safetensors stage_a.safetensors previewer.safetensors effnet_encoder.safetensors stage_b_lite_bf16.safetensors stage_c_bf16.safetensors controlnet/canny.safetensors
It seems that on first run it needs to download pytorch_model-00001-of-00002.bin and pytorch_model-00002-of-00002.bin for models--laion--CLIP-ViT-bigG-14-laion2B-39B-b160k from cdn-lfs.huggingface.co as dependency models. The files are huge, though, and there is no progress bar during the download, so I thought it was stuck...
You can manually download the models listed here:
https://github.com/catcathh/UltraPixel/blob/main/models/models_checklist.txt
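If you prefer scripting the downloads, here is a rough sketch (not part of the repo) of fetching the Stable Cascade files with huggingface_hub; it assumes they come from the stabilityai/stable-cascade Hugging Face repo, as the checklist points to, and that your destination is ComfyUI's models/ultrapixel folder. ultrapixel_t2i.safetensors is a separate download and isn't covered here.

```python
# Sketch only: pull the Stable Cascade weights the workflow expects into
# ComfyUI's models/ultrapixel folder. Assumes the stabilityai/stable-cascade
# repo hosts them (check models_checklist.txt for the authoritative links).
# You may need `huggingface-cli login` if the repo asks you to accept its license.
from huggingface_hub import hf_hub_download

dest = "ComfyUI_windows_portable/ComfyUI/models/ultrapixel"
files = [
    "stage_a.safetensors",
    "previewer.safetensors",
    "effnet_encoder.safetensors",
    "stage_b_lite_bf16.safetensors",
    "stage_c_bf16.safetensors",
    "controlnet/canny.safetensors",  # lands in a controlnet/ subfolder
]

for name in files:
    path = hf_hub_download(
        repo_id="stabilityai/stable-cascade",
        filename=name,
        local_dir=dest,  # keeps the controlnet/ folder structure
    )
    print("downloaded:", path)
```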
I downloaded the models manually and put them in ComfyUI_windows_portable\ComfyUI\models\ultrapixel, but when I run the workflow it gets stuck at "Downloading shards: 0%| | 0/2 [00:00<?, ?it/s]".
Same error here... stuck at "Downloading shards: 0%| | 0/2 [00:00<?, ?it/s]".
I have all the models downloaded. Is this also required? -> "laion/CLIP-ViT-bigG-14-laion2B-39B-b160k"?
Same error here, stuck at the shard download.
It's actually a network issue; if your connection is fine, just wait patiently and the download will finish. From what I can see it downloads two Hugging Face folders, models--openai--clip-vit-large-patch14 and models--laion--CLIP-ViT-bigG-14-laion2B-39B-b160k, into the Hugging Face cache directory .cache\huggingface\hub (set via the Hugging Face environment variables).
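If you would rather see progress bars and warm the cache before launching ComfyUI, something like the sketch below should work; it assumes the default cache location (set HF_HOME if yours lives elsewhere) and just pre-fetches the two repos mentioned above.

```python
# Sketch: warm the Hugging Face cache so the "Downloading shards" step finds
# everything locally. Files go to ~/.cache/huggingface/hub by default (or
# wherever HF_HOME points), and snapshot_download shows per-file progress bars.
# Note this grabs the whole repo; pass allow_patterns to limit what is fetched.
from huggingface_hub import snapshot_download

for repo_id in (
    "openai/clip-vit-large-patch14",
    "laion/CLIP-ViT-bigG-14-laion2B-39B-b160k",
):
    local_path = snapshot_download(repo_id=repo_id)
    print(repo_id, "->", local_path)
```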
What exactly is it downloading? Is there a link? I'm stuck here too.
I have all the models downloaded. Is this also required? -> "laion/CLIP-ViT-bigG-14-laion2B-39B-b160k"?
Yes, they use: https://github.com/2kpr/ComfyUI-UltraPixel/blob/main/train/train_t2i.py#L143-L144 (and even though that path says 'train', that is how they set up their code: inference references classes in the train files. Eventually I can work through their code and switch it up, but that is how they had it.)
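In other words, those lines boil down to roughly the following (a sketch, not the repo's exact code): the bigG CLIP text encoder is loaded by repo name, and on a cold cache transformers fetches its two shards, which is the "Downloading shards: 0/2" step everyone is waiting on.

```python
# Rough equivalent of what the linked train_t2i.py lines do (a sketch, not the
# exact code): load the bigG CLIP text encoder by name, which triggers the
# two-shard download on first run if it isn't cached yet.
from transformers import AutoTokenizer, CLIPTextModelWithProjection

clip_name = "laion/CLIP-ViT-bigG-14-laion2B-39B-b160k"
tokenizer = AutoTokenizer.from_pretrained(clip_name)
text_model = CLIPTextModelWithProjection.from_pretrained(clip_name)  # two ~5 GB shards on first use
```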
How are you all installing ComfyUI-UltraPixel, manually or with the ComfyUI Manager?
I just wiped my whole ComfyUI install and tried installing it and ComfyUI-UltraPixel two ways:
1. I git cloned ComfyUI, ran pip install -r requirements.txt from its cloned dir, then went to the custom_nodes dir, git cloned ComfyUI-UltraPixel, and ran pip install -r requirements.txt from its cloned dir. I launched ComfyUI, loaded workflow-default.json, ran it, and it worked without issue.
2. I git cloned ComfyUI, ran pip install -r requirements.txt from its cloned dir, then went to the custom_nodes dir, git cloned https://github.com/ltdrdata/ComfyUI-Manager, and restarted ComfyUI. I then used the ComfyUI-Manager from within ComfyUI (see pic below) to install ComfyUI-UltraPixel, loaded workflow-default.json, ran it, and it worked without issue.
This author's script is terrible! The model configuration can't be switched, and which models are needed isn't documented in detail; some are listed, some aren't, things are just left out! controlnet/canny.safetensors, does that model even exist? It should be the ControlNet that goes with StableCascade, right? And then there's the laion/CLIP-ViT-bigG-14-laion2B-39B-b160k model mentioned above, more than ten GB, downloaded over the network while the workflow is running. Brain-dead! A script like this is bound to be full of bugs. Since you've already written the script, couldn't you take a moment to note what is required and how to install and configure it?
The code used in ComfyUI-UltraPixel is from the official UltraPixel repo https://github.com/catcathh/UltraPixel, so please take up any issues with how it was coded with them (and if you used their repo and their test inference code, it would also download the ~10GB clip, etc.). I merely wanted to bring the UltraPixel functionality to ComfyUI, and to do that I'm of course using their code, which results in the things you mention above. I also wanted to release it before rewriting their entire codebase and instead make continual improvements as I slowly modify and tweak it; for example, I just added 10GB/12GB/16GB GPU functionality. Eventually I'll probably end up rewriting the entire codebase for better integration with ComfyUI, but for now this will have to do.
And yes, controlnet/canny.safetensors is the official controlnet for StableCascade, see here: https://huggingface.co/stabilityai/stable-cascade/tree/main/controlnet
As of 7/18, having built ComfyUI-UltraPixel on the original code from https://github.com/catcathh/UltraPixel, I'm now going to completely rewrite it for much better integration with ComfyUI's native code, rather than ComfyUI-UltraPixel basically being a 'modified wrapper' around the original UltraPixel code. This will bring the 'standard' ComfyUI prompt/clip handling and model loading, positive/negative weighted prompting, and no longer having to download the 10GB text/clip model the original UltraPixel code requires (i.e. the 'Downloading shards' you have all seen and had to wait on), among other features, while of course retaining the ability to work with 10GB/12GB/16GB GPUs, etc.
We do appreciate your work, but since it's open source, we're offering suggestions and talking with you! We just hope you make it better! Looking forward to you refactoring the code around the standard ComfyUI patterns. The official code is also poor: it was written just to get the feature working, with no optimization, and it runs extremely slowly... As for the "bf16" question, that should definitely be a configuration option. As I said before, I've read through and fully run the official code; in the official config "bf16" is merely the preferred choice, and "float16/float32" are also supported. In ComfyUI there's really no need to pull in a CLIP as big as CLIP-ViT-bigG-14-laion2B-39B-b160k; just use ComfyUI's own. Looking forward to your more optimized version...