yolain / ComfyUI-Easy-Use

To make ComfyUI easier to use, I have made some optimizations and integrations of commonly used nodes.
GNU General Public License v3.0

Version: sampling no longer works starting from 9639c3a #198

Closed · liulsg closed this issue 3 weeks ago

liulsg commented 3 weeks ago

Version v1.1.9 works fine; the dependencies have been reinstalled as required. (screenshot)

bigmoon2023 commented 3 weeks ago

Same problem here; the error is: Error occurred when executing easy kSampler: 'transformer_options'.

Also, could the author add (720, 1080), (1080, 720), (832, 1216) and (1216, 832), the four resolutions commonly used with SDXL models, to the default resolutions in the settings file? They can be added manually, but every update overwrites the custom entries.

yolain commented 3 weeks ago

If it's convenient, please post the complete error output. It runs normally for me in two different system environments.

liulsg commented 3 weeks ago

@yolain (screenshot)

[EasyUse] Loading finished...
Requested to load SDXL
Loading 1 new model
!!! Exception during processing!!! 'transformer_options'
Traceback (most recent call last):
  File "G:\AI\ComfyUI_M\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "G:\AI\ComfyUI_M\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "G:\AI\ComfyUI_M\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 4792, in simple
    return super().run(pipe, None, None, None, None, None, image_output, link_id, save_prefix,
  File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 4761, in run
    return process_sample_state(pipe, samp_model, samp_clip, samp_samples, samp_vae, samp_seed, samp_positive, samp_negative, steps, start_step, last_step, cfg, sampler_name, scheduler, denoise, image_output, link_id, save_prefix, tile_size, prompt, extra_pnginfo, my_unique_id, preview_latent, force_full_denoise, disable_noise, samp_custom)
  File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\easyNodes.py", line 4545, in process_sample_state
    samp_samples = sampler.common_ksampler(samp_model, samp_seed, steps, cfg, sampler_name, scheduler, samp_positive, samp_negative, samp_samples, denoise=denoise, preview_latent=preview_latent, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, disable_noise=disable_noise)
  File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI-Easy-Use\py\libs\sampler.py", line 110, in common_ksampler
    samples = kSampler.sample(noise, positive, negative, cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=None, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1446, in KSampler_sample
    return _KSampler_sample(*args, **kwargs)
  File "G:\AI\ComfyUI_M\ComfyUI\comfy\samplers.py", line 761, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1469, in sample
    return _sample(*args, **kwargs)
  File "G:\AI\ComfyUI_M\ComfyUI\comfy\samplers.py", line 663, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "G:\AI\ComfyUI_M\ComfyUI\comfy\samplers.py", line 640, in sample
    self.inner_model, self.conds, self.loaded_models = comfy.sampler_helpers.prepare_sampling(self.model_patcher, noise.shape, self.conds)
  File "G:\AI\ComfyUI_M\ComfyUI\comfy\sampler_helpers.py", line 64, in prepare_sampling
    comfy.model_management.load_models_gpu([model] + models, model.memory_required([noise_shape[0] * 2] + list(noise_shape[1:])) + inference_memory)
  File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\Fooocus_Nodes\py\modules\patch.py", line 447, in patched_load_models_gpu
    y = comfy.model_management.load_models_gpu_origin(*args, **kwargs)
  File "G:\AI\ComfyUI_M\ComfyUI\comfy\model_management.py", line 464, in load_models_gpu
    cur_loaded_model = loaded_model.model_load(lowvram_model_memory, force_patch_weights=force_patch_weights)
  File "G:\AI\ComfyUI_M\ComfyUI\comfy\model_management.py", line 293, in model_load
    self.model.model_patches_to(self.device)
  File "G:\AI\ComfyUI_M\ComfyUI\comfy\model_patcher.py", line 184, in model_patches_to
    to = self.model_options["transformer_options"]
KeyError: 'transformer_options'

Prompt executed in 77.20 seconds

liulsg commented 3 weeks ago

My test procedure:
1. Upgraded argostranslate 1.9.3 to argostranslate 1.9.6 (purpose: support sentencepiece 0.2.0). Result: still errors.
2. Disabled the "ComfyUI_smZNodes" plugin. Result: runs normally.

yolain commented 3 weeks ago

My test procedure: 1. Upgraded argostranslate 1.9.3 to argostranslate 1.9.6 (purpose: support sentencepiece 0.2.0). Result: still errors. 2. Disabled the "ComfyUI_smZNodes" plugin. Result: runs normally.

The smZ build from May 15 runs fine on my end. I've added the missing parameters on my side; please update and try again.
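For context, the traceback above dies in comfy/model_patcher.py on `to = self.model_options["transformer_options"]`, so any code path that rebuilds `model_options` without that key raises this KeyError. A minimal defensive sketch, assuming you want to guard the options dict yourself (the helper name `ensure_transformer_options` is hypothetical and is not the actual commit referenced here):

    # Hypothetical guard, not the actual fix: make sure a model_options dict
    # always carries "transformer_options" before it reaches the sampler,
    # since comfy/model_patcher.py reads the key directly.
    def ensure_transformer_options(model_options: dict) -> dict:
        model_options.setdefault("transformer_options", {})
        return model_options

    # Example use (illustrative): ensure_transformer_options(model_patcher.model_options)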

liulsg commented 3 weeks ago

@yolain It's working normally now, thank you for your effort. I'm starting to migrate my workflows to your workflow nodes and will test basically every feature. If I find bugs I'll keep reporting them; I hope you'll bear with the interruptions.

liulsg commented 3 weeks ago

@yolain With the A1111 parsing mode --> the prompt [dog:cow:4] errors; others, such as the alternation format [dog|cow], work fine.

If ComfyUI_ADV_CLIP_emb is used for the A1111 style instead, there is no error.
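For readers unfamiliar with the A1111 syntax: [dog:cow:4] is prompt editing (use "dog" until step 4, then "cow"), while [dog|cow] is prompt alternation (swap the two words on every step). A rough illustration of the intended schedule, not smZNodes' actual parser:

    # Illustration only: the schedules the two A1111 syntaxes are meant to produce.
    def scheduled_word(step: int) -> str:
        # [dog:cow:4] -> "dog" before step 4, "cow" from step 4 onward
        return "dog" if step < 4 else "cow"

    def alternated_word(step: int) -> str:
        # [dog|cow] -> alternate between the two words on every step
        return "dog" if step % 2 == 0 else "cow"

    for s in range(6):
        print(s, scheduled_word(s), alternated_word(s))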

liulsg commented 3 weeks ago

Error message: Error occurred when executing smZ CLIPTextEncode:

float() argument must be a string or a real number, not 'Tree'

File "G:\AI\ComfyUI_M\ComfyUI\execution.py", line 151, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\execution.py", line 81, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\execution.py", line 74, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\nodes.py", line 87, in encode result = run(params) ^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 754, in run cond, pooled = clip_clone.encode_from_tokens(tokens, True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\comfy\sd.py", line 135, in encode_from_tokens cond, pooled = self.cond_stage_model.encode_token_weights(tokens) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 488, in encode_token_weights g_out, g_pooled = self.clip_g.encode_token_weights(token_weight_pairs_g, steps, current_step, multi) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 442, in encode_token_weights if multi: schedules = prompt_parser.get_multicond_learned_conditioning(model_hijack.cond_stage_model, texts, steps, None, opts.use_old_scheduling) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\prompt_parser.py", line 270, in get_multicond_learned_conditioning learned_conditioning = get_learned_conditioning(model, prompt_flat_list, steps, hires_steps, use_old_scheduling) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\prompt_parser.py", line 187, in get_learned_conditioning prompt_schedules = get_learned_conditioning_prompt_schedules(prompts, steps, hires_steps, use_old_scheduling) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\prompt_parser.py", line 142, in get_learned_conditioning_prompt_schedules promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)} ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\prompt_parser.py", line 142, in promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)} ^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\prompt_parser.py", line 140, in get_schedule return [[t, at_step(t, tree)] for t in collect_steps(steps, tree)] ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\prompt_parser.py", line 106, in collect_steps CollectSteps().visit(tree) File "G:\AI\ComfyUI_M\python_miniconda\Lib\site-packages\lark\visitors.py", line 316, in visit self._call_userfunc(subtree) File "G:\AI\ComfyUI_M\python_miniconda\Lib\site-packages\lark\visitors.py", line 294, in _call_userfunc return getattr(self, tree.data, self.default)(tree) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "G:\AI\ComfyUI_M\ComfyUI\custom_nodes\ComfyUI_smZNodes\modules\prompt_parser.py", line 91, in scheduled v = float(s)
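The message itself just means float() was handed a lark parse-tree node rather than the numeric step token the scheduled rule expects; the error type can be reproduced in isolation (illustrative only):

    # Reproducing the error type in isolation: float() cannot convert a lark Tree.
    from lark import Tree
    float(Tree("scheduled", []))
    # TypeError: float() argument must be a string or a real number, not 'Tree'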

yolain commented 3 weeks ago

@yolain

With the A1111 parsing mode --> the prompt [dog:cow:4] errors; others, such as the alternation format [dog|cow], work fine.

If ComfyUI_ADV_CLIP_emb is used for the A1111 style instead, there is no error.

That's an issue with your smZNodes version.

liulsg commented 3 weeks ago

@yolain I'm on the latest version: d8fff13 ----> I'll uninstall and reinstall to try.

liulsg commented 3 weeks ago

@yolain The problem has been pinned down.
1. If the ":" in the prompt [dog:cow:4] is typed as a full-width (Chinese) character, there is no error; with the half-width (English) colon and A1111 mode enabled, it errors (see the quick character check below).
2. From the latest test: ComfyUI's native "CLIP Text Encode" can now also handle prompt scheduling and prompt alternation (not sure when that was added), so enabling A1111 mode doesn't seem to matter much anymore.
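A likely reason the full-width colon "works" is simply that it is a different character from the ASCII colon the scheduling grammar matches, so [dog：cow：4] is treated as plain prompt text rather than parsed as a schedule. That parser behavior is an assumption, but the character difference itself is easy to check:

    # The half-width ':' (U+003A) and the full-width '：' (U+FF1A) are distinct
    # characters, so a grammar that only matches ':' will ignore the latter.
    half, full = ":", "："
    print(hex(ord(half)), hex(ord(full)))  # 0x3a 0xff1a
    print(half == full)                    # False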

yolain commented 3 weeks ago
(screenshot: 2024-06-07 16 15 22)

liulsg commented 3 weeks ago

Since the half-width (English) colon triggers the error, I simply replaced the colon with the full-width (Chinese) character. (screenshot)