Open hartmark opened 1 month ago
This is the workflow in the example:
Hitting the same error here. I'll have to review the code and see what's going on.
Getting the same error! Following this thread for updates.
Following, super handy for XY LoRA testing.
Hi, any update on this? Would love to use your loaders with flux, regular workflow is such a pain :( Thanks for your hard work BTW.
Same issue here, would love to use your epic nodes!
The model is the one from this page: https://comfyanonymous.github.io/ComfyUI_examples/flux/?ref=blog.comfy.org#simple-to-use-fp8-checkpoint-version
Direct download: https://huggingface.co/Comfy-Org/flux1-dev/blob/main/flux1-dev-fp8.safetensors
Error occurred when executing Efficient Loader:
't5xxl'
File "/comfyui/execution.py", line 152, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "/comfyui/execution.py", line 82, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "/comfyui/execution.py", line 75, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "/comfyui/custom_nodes/efficiency-nodes-comfyui/efficiency_nodes.py", line 179, in efficientloader
    encode_prompts(positive, negative, token_normalization, weight_interpretation, clip, clip_skip,
File "/comfyui/custom_nodes/efficiency-nodes-comfyui/efficiency_nodes.py", line 80, in encode_prompts
    positive_encoded = bnk_adv_encode.AdvancedCLIPTextEncode().encode(clip, positive_prompt, token_normalization, weight_interpretation)[0]
File "/comfyui/custom_nodes/efficiency-nodes-comfyui/py/bnk_adv_encode.py", line 312, in encode
    embeddings_final, pooled = advanced_encode(clip, text, token_normalization, weight_interpretation, w_max=1.0,
File "/comfyui/custom_nodes/efficiency-nodes-comfyui/py/bnk_adv_encode.py", line 262, in advanced_encode
    return advanced_encode_from_tokens(tokenized['l'],
File "/comfyui/custom_nodes/efficiency-nodes-comfyui/py/bnk_adv_encode.py", line 183, in advanced_encode_from_tokens
    weighted_emb, pooled_base = encode_func(weighted_tokens)
File "/comfyui/custom_nodes/efficiency-nodes-comfyui/py/bnk_adv_encode.py", line 265, in <lambda>
    lambda x: (clip.encode_from_tokens({'l': x}), None),
File "/comfyui/comfy/sd.py", line 116, in encode_from_tokens
    o = self.cond_stage_model.encode_token_weights(tokens)
File "/comfyui/comfy/text_encoders/flux.py", line 55, in encode_token_weights
    token_weight_pairs_t5 = token_weight_pairs["t5xxl"]
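Reading the traceback, the failure looks like a token-dictionary mismatch: the node's advanced encode path forwards only the CLIP-L token stream (`{'l': x}`), while Flux's `encode_token_weights` also looks up a `"t5xxl"` entry. A minimal sketch of that mismatch (simplified, hypothetical function, not ComfyUI's actual classes):

```python
# Hypothetical, simplified stand-in for comfy/text_encoders/flux.py's
# encode_token_weights: Flux expects BOTH a CLIP-L and a T5-XXL token stream.
def flux_encode_token_weights(token_weight_pairs):
    token_weight_pairs_l = token_weight_pairs["l"]
    # This is the lookup that fails in the traceback when only 'l' is supplied:
    token_weight_pairs_t5 = token_weight_pairs["t5xxl"]
    return token_weight_pairs_l, token_weight_pairs_t5

# The node's encode path only passes the CLIP-L tokens, so the lookup raises.
tokens = {"l": [("photo of a cat", 1.0)]}
try:
    flux_encode_token_weights(tokens)
except KeyError as e:
    print(e)  # prints 't5xxl'
```

This matches why the error message is just `'t5xxl'`: it's the raw `KeyError` from that dictionary lookup, so a fix would presumably need the loader to tokenize and forward the T5-XXL stream as well when a Flux checkpoint is loaded.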
This seems a bit related to #201. Is there any plan to support SD3 and/or Flux models?