Closed · arkinson9 closed this issue 1 week ago
Yes, model name input is a string, so you can source it from anything (that produces strings).
Converting "modelname" to a widget and typing the model name in manually did not work for me.
This works fine. Post a minimal failing workflow to check if you can't get it to work.
Thank you for your reply. I had already tried this. The workflow itself works fine, of course. But on civitai the checkpoint is not displayed under Resources.
I use this checkpoint in my example: https://civitai.com/models/617609/flux1-dev The downloaded filename is: flux1Dev_v10.safetensors So flux1Dev_v10 as the string should work, I suppose. But as you can see - Resources are empty.
mmh - really no solution?
I guess it is a question of the civitai hash, which has to be stored somewhere. Manually creating a "modelname.sha256" file with the corresponding civitai SHA256 hash in the unet model directory didn't do the trick so far.
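(Side note: to rule out a file mismatch, you can check whether your local file matches civitai's listed SHA256 with a few lines of plain Python - just a sketch, no ComfyUI needed; the path in the comment is an example from my setup.)

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a (possibly large) model file in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    # civitai displays the digest in upper case
    return h.hexdigest().upper()

# Compare the result against the hash shown on the civitai model page, e.g.:
# sha256_of_file(r"D:\ComfyUI\ComfyUI\models\unet\flux1Dev_v10.safetensors")
```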
Did you try with flux1Dev_v10.safetensors
as the string? That's what Saver gets when the model is loaded with the Saver's checkpoint loader.
The filename is then appended to ComfyUI's checkpoints path variable: https://github.com/alexopus/ComfyUI-Image-Saver/blob/main/nodes.py#L442-L447 to make a full path to the file, and a hash is computed for that file.
So either the extension is missing, or the Unet Loader does not load the file from that location. Or something else is wrong with the path - maybe a subdirectory?
Of course, the produced hash has to match whatever civitai hashes - does civitai also hash only the unet model?
Yes, I had already tried the string including the file extension: flux1Dev_v10.safetensors
- but no luck.
I use the normal path without a subdirectory: D:\ComfyUI\ComfyUI\models\unet.
What do you mean by "... does it also only hash the unet model?" Checkpoint models work fine with the "Checkpoint Loader With Name" node. And yes, I can see that the output that goes to the Saver node is just a string in the format "checkpointmodelname.safetensors". I have the problem only with the unet models.
I don't know much about Python, but if I have a look into the nodes.py file:
ckpt_path = folder_paths.get_full_path("checkpoints", ckpt_name)
I would guess ckpt_path only points to \models\checkpoints and not to the \models\unet path - right? I didn't find anything about the unet path in that file.
Can you upload one of the created images (the one where you included the .safetensors extension), to see if it contains a hash of the model?
What do you mean with "... does it also only hash the unet model?"
Did you get the unet file from civitai? Even if the hash gets written into your images correctly, civitai would have to have the exact same model so that the hashes match and the resource is linked. For example, if civitai distributes the flux model only in its checkpoint form, then there will be no linkage, because the hashes are different. There are at least two different versions of Flux: one as a single checkpoint file and another split across several files.
Thank you very much for trying to help. OK - just to have a clean baseline and to exclude any file mismatch, I freshly downloaded the file from civitai:
https://civitai.com/models/617609/flux1-dev There is only the unet version, the hash is: 4610115BB0C89560703C892C59AC2742FA821E60EF5871B33493BA544683ABD7, and the filename is: flux1Dev_v10.safetensors.
Here is my simple workflow:
and the console output:
```
got prompt
model weight dtype torch.float8_e4m3fn, manual cast: torch.bfloat16
model_type FLUX
New prompt: test, highest quality, 32k, intricate details
Using pytorch attention in VAE
Using pytorch attention in VAE
D:\ComfyUI\python_embeded\lib\site-packages\transformers\tokenization_utils_base.py:1601: FutureWarning: clean_up_tokenization_spaces was not set. It will be set to True by default. This behavior will be depracted in transformers v4.45, and will be then set to False by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
Requested to load FluxClipModel_
Loading 1 new model
loaded completely 0.0 4777.53759765625 True
clip missing: ['text_projection.weight']
Requested to load Flux
Loading 1 new model
loaded partially 9775.75151171875 9774.536193847656 0
100%|██████████████████████████████████████████| 10/10 [00:52<00:00, 5.26s/it]
Requested to load AutoencodingEngine
Loading 1 new model
loaded completely 0.0 159.87335777282715 True
Prompt executed in 323.81 seconds
```
and the image:
As I can see from the EXIF data, no hash data is included.
mmh - really no idea? Could you please just confirm that models from the unet folder work on your side? Because, as mentioned above, the
ckpt_path = folder_paths.get_full_path("checkpoints", ckpt_name)
in the nodes.py file seems not to take the unet folder into account.
I pushed an update that will check the unet directory if the filename was not found in checkpoints. Make sure you include the full path after the unet dir if there are any subdirectories, e.g. FLUX1/flux1-dev-fp8.safetensors.
Works perfectly now! Thank you very much for improving this useful node.
Is there a way to add the "modelname" for the checkpoint manually, as a string, hash, or AIR, to display it on civitai?
Unfortunately, some Flux models can only be loaded with the "Unet Loader" instead of the "Checkpoint Loader With Name" node. I have tried to use the "Checkpoint Name Selector" node as an input for the modelname - but of course, it can't find the unet models.
Converting "modelname" to a widget and typing the model name in manually did not work for me.