Chaoses-Ib / ComfyScript

A Python frontend and library for ComfyUI

AssertionError when using `load()` in example notebook #9

KaijuML closed this issue 10 months ago

KaijuML commented 10 months ago

Hey, thanks very much for your work, it looks awesome.

I hit an error when trying to make the example notebook runtime.ipynb work:

```python
from script.runtime import *

load()
```
Click here to see the error stack trace:

```
Nodes: 529
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
Cell In[1], line 4
      1 from script.runtime import *
      3 # load('http://127.0.0.1:8188/')
----> 4 load()
      6 # Nodes can only be imported after load()
      7 from script.runtime.nodes import *

File /mnt/c/Users/clementr/Projects/stablediffusion/ComfyUI/custom_nodes/ComfyScript/script/runtime/__init__.py:18, in load(api_endpoint, vars, watch, save_script_source)
     17 def load(api_endpoint: str = 'http://127.0.0.1:8188/', vars: dict | None = None, watch: bool = True, save_script_source: bool = True):
---> 18     asyncio.run(_load(api_endpoint, vars, watch, save_script_source))

File ~/miniconda3/envs/comfyui/lib/python3.10/site-packages/nest_asyncio.py:30, in _patch_asyncio.<locals>.run(main, debug)
     28 task = asyncio.ensure_future(main)
     29 try:
---> 30     return loop.run_until_complete(task)
     31 finally:
     32     if not task.done():

File ~/miniconda3/envs/comfyui/lib/python3.10/site-packages/nest_asyncio.py:98, in _patch_loop.<locals>.run_until_complete(self, future)
     95 if not f.done():
     96     raise RuntimeError(
     97         'Event loop stopped before Future completed.')
---> 98 return f.result()

File ~/miniconda3/envs/comfyui/lib/python3.10/asyncio/futures.py:201, in Future.result(self)
    199 self.__log_traceback = False
    200 if self._exception is not None:
--> 201     raise self._exception.with_traceback(self._exception_tb)
    202 return self._result

File ~/miniconda3/envs/comfyui/lib/python3.10/asyncio/tasks.py:232, in Task.__step(***failed resolving arguments***)
    228 try:
    229     if exc is None:
    230         # We use the `send` method directly, because coroutines
    231         # don't have `__iter__` and `__next__` methods.
--> 232         result = coro.send(None)
    233     else:
    234         result = coro.throw(exc)

File /mnt/c/Users/clementr/Projects/stablediffusion/ComfyUI/custom_nodes/ComfyScript/script/runtime/__init__.py:29, in _load(api_endpoint, vars, watch, save_script_source)
     26 nodes_info = await api._get_nodes_info()
     27 print(f'Nodes: {len(nodes_info)}')
---> 29 nodes.load(nodes_info, vars)
     31 # TODO: Stop watch if watch turns to False
     32 if watch:

File /mnt/c/Users/clementr/Projects/stablediffusion/ComfyUI/custom_nodes/ComfyScript/script/runtime/nodes.py:14, in load(nodes_info, vars)
     12 fact = VirtualRuntimeFactory()
     13 for node_info in nodes_info.values():
---> 14     fact.add_node(node_info)
     16 globals().update(fact.vars())
     17 __all__.extend(fact.vars().keys())

File /mnt/c/Users/clementr/Projects/stablediffusion/ComfyUI/custom_nodes/ComfyScript/script/runtime/factory.py:218, in RuntimeFactory.add_node(self, info)
    215     config = {}
    216 inputs.append(f'{name}: {type_and_hint(type_info, name, optional, config.get("default"))[1]}')
--> 218 output_types = [type_and_hint(type, output=True)[0] for type in info['output']]
    220 outputs = len(info['output'])
    221 if outputs >= 2:

File /mnt/c/Users/clementr/Projects/stablediffusion/ComfyUI/custom_nodes/ComfyScript/script/runtime/factory.py:218, in <listcomp>(.0)
    215     config = {}
    216 inputs.append(f'{name}: {type_and_hint(type_info, name, optional, config.get("default"))[1]}')
--> 218 output_types = [type_and_hint(type, output=True)[0] for type in info['output']]
    220 outputs = len(info['output'])
    221 if outputs >= 2:

File /mnt/c/Users/clementr/Projects/stablediffusion/ComfyUI/custom_nodes/ComfyScript/script/runtime/factory.py:117, in RuntimeFactory.add_node.<locals>.type_and_hint(type_info, name, optional, default, output)
    115 if isinstance(type_info, list):
    116     if output: print(type_info)
--> 117     assert not output
    118     if is_bool_enum(type_info):
    119         t = bool

AssertionError: 
```

From what I can tell, everything looks OK: my ComfyUI is running and works correctly.

When I investigated a bit, I added a debug print to `RuntimeFactory.add_node` in `script/runtime/factory.py` (at line 116):

```python
# script/runtime/factory.py, inside RuntimeFactory.add_node's type_and_hint helper
if isinstance(type_info, list):      # line 115
    if output: print(type_info)      # line 116: added for debugging
    assert not output                # line 117
```

and I get the following list printed, which appears to be the models I downloaded from Civitai:

```
['cyberrealistic_v41BackToBasics.safetensors', 'dreamshaper_8.safetensors', 'realisticVisionV60B1_v20Novae.safetensors']
```
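As a side note, the full definition of whichever node declares these values can also be dumped straight from ComfyUI's HTTP API instead of patching ComfyScript. A minimal sketch, assuming a default local server and ComfyUI's `/object_info` route:

```python
import json
import urllib.request

# Sketch (not part of ComfyScript): print every node whose declared outputs
# contain a raw list of values (an enum), the shape the assertion trips over.
# Assumes ComfyUI is reachable at its default local address.
COMFYUI_URL = 'http://127.0.0.1:8188'

with urllib.request.urlopen(f'{COMFYUI_URL}/object_info') as resp:
    nodes_info = json.load(resp)

for name, node_info in nodes_info.items():
    if any(isinstance(t, list) for t in node_info.get('output', [])):
        print(name)
        print(json.dumps(node_info, indent=2))
```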

I don't understand the code well enough to debug further; could you help me, please?

Thanks very much, Clement

Chaoses-Ib commented 10 months ago

Can you `print(info)` and post the output here? The error occurs because the custom node declares an enum (a raw list of values) as one of its outputs; ComfyScript can't determine what enum name to use in that case.
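For context, the assertion fires in `RuntimeFactory.add_node` when an output's type info is a plain list rather than a type name. A minimal sketch of the kind of fallback that would avoid the crash (illustrative only, not necessarily what the eventual fix does; `output_type_and_hint` is a hypothetical name):

```python
def output_type_and_hint(type_info):
    """Hypothetical sketch: map a ComfyUI output type declaration to a Python
    type. Each declared output is either a type name string ('MODEL', 'INT',
    ...) or, for enum-valued outputs, a raw list of the allowed values.
    """
    if isinstance(type_info, list):
        # Enum-valued output: instead of asserting, fall back to str, since
        # the node emits one of the listed string values at runtime.
        return str
    return type_info  # plain type name, e.g. 'MODEL'

# Applied to the first few declared outputs of the node in question:
outputs = [
    ['cyberrealistic_v41BackToBasics.safetensors', 'dreamshaper_8.safetensors'],
    'MODEL', 'CLIP', 'VAE',
]
print([output_type_and_hint(t) for t in outputs])
# [<class 'str'>, 'MODEL', 'CLIP', 'VAE']
```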

KaijuML commented 10 months ago

Hey, thanks for the reply. Here's the output you asked for:

Click here to show/hide the output:

```
{
  "input": {
    "required": {
      "ckpt_name": [["cyberrealistic_v41BackToBasics.safetensors", "dreamshaper_8.safetensors", "realisticVisionV60B1_v20Novae.safetensors"]]
    },
    "optional": {
      "vae_name": [["baked VAE", "vae-ft-mse-840000-ema-pruned.safetensors"], {"default": "baked VAE"}],
      "model_version": [["SDv1 512px", "SDv2 768px", "SDXL 1024px"], {"default": "SDv1 512px"}],
      "config_name": [["none", "anything_v3.yaml", "v1-inference.yaml", "v1-inference_clip_skip_2.yaml", "v1-inference_clip_skip_2_fp16.yaml", "v1-inference_fp16.yaml", "v1-inpainting-inference.yaml", "v2-inference-v.yaml", "v2-inference-v_fp32.yaml", "v2-inference.yaml", "v2-inference_fp32.yaml", "v2-inpainting-inference.yaml"], {"default": "none"}],
      "seed": ["INT", {"default": -1, "min": -3, "max": 18446744073709551615}],
      "steps": ["INT", {"default": 20, "min": 1, "max": 10000}],
      "refiner_start": ["FLOAT", {"default": 0.8, "min": 0.0, "max": 1.0, "step": 0.01}],
      "cfg": ["FLOAT", {"default": 8.0, "min": 0.0, "max": 100.0, "step": 0.5, "round": 0.01}],
      "sampler_name": [["euler", "euler_ancestral", "heun", "heunpp2", "dpm_2", "dpm_2_ancestral", "lms", "dpm_fast", "dpm_adaptive", "dpmpp_2s_ancestral", "dpmpp_sde", "dpmpp_sde_gpu", "dpmpp_2m", "dpmpp_2m_sde", "dpmpp_2m_sde_gpu", "dpmpp_3m_sde", "dpmpp_3m_sde_gpu", "ddpm", "lcm", "ddim", "uni_pc", "uni_pc_bh2"]],
      "scheduler": [["normal", "karras", "exponential", "sgm_uniform", "simple", "ddim_uniform"]],
      "positive_ascore": ["FLOAT", {"default": 6.0, "min": 0.0, "max": 1000.0, "step": 0.01}],
      "negative_ascore": ["FLOAT", {"default": 6.0, "min": 0.0, "max": 1000.0, "step": 0.01}],
      "aspect_ratio": [["custom", "1:1 - 512x512 | 768x768 | 1024x1024", "4:3 - 576x448 | 864x672 | 1152x896", "3:4 - 448x576 | 672x864 | 896x1152", "3:2 - 608x416 | 912x624 | 1216x832", "2:3 - 416x608 | 624x912 | 832x1216", "16:9 - 672x384 | 1008x576 | 1344x768", "9:16 - 384x672 | 576x1008 | 768x1344", "21:9 - 768x320 | 1152x480 | 1536x640", "9:21 - 320x768 | 480x1152 | 640x1536"], {"default": "custom"}],
      "width": ["INT", {"default": 512, "min": 1, "max": 8192, "step": 8}],
      "height": ["INT", {"default": 512, "min": 1, "max": 8192, "step": 8}],
      "batch_size": ["INT", {"default": 1, "min": 1, "max": 4096}]
    }
  },
  "output": [
    ["cyberrealistic_v41BackToBasics.safetensors", "dreamshaper_8.safetensors", "realisticVisionV60B1_v20Novae.safetensors"],
    "MODEL", "CLIP", "VAE", "INT", "INT", "INT", "FLOAT",
    ["euler", "euler_ancestral", "heun", "heunpp2", "dpm_2", "dpm_2_ancestral", "lms", "dpm_fast", "dpm_adaptive", "dpmpp_2s_ancestral", "dpmpp_sde", "dpmpp_sde_gpu", "dpmpp_2m", "dpmpp_2m_sde", "dpmpp_2m_sde_gpu", "dpmpp_3m_sde", "dpmpp_3m_sde_gpu", "ddpm", "lcm", "ddim", "uni_pc", "uni_pc_bh2"],
    ["normal", "karras", "exponential", "sgm_uniform", "simple", "ddim_uniform"],
    "FLOAT", "FLOAT", "INT", "INT", "INT", "STRING"
  ],
  "output_is_list": [false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false],
  "output_name": ["MODEL_NAME", "MODEL", "CLIP", "VAE", "SEED", "STEPS", "REFINER_START_STEP", "CFG", "SAMPLER_NAME", "SCHEDULER", "POSITIVE_ASCORE", "NEGATIVE_ASCORE", "WIDTH", "HEIGHT", "BATCH_SIZE", "PARAMETERS"],
  "name": "SDParameterGenerator",
  "display_name": "SD Parameter Generator",
  "description": "",
  "category": "SD Prompt Reader",
  "output_node": false
}
```
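The problematic entries are visible in `output`: the first one, paired with the `MODEL_NAME` output, is a raw list of checkpoint names rather than a type name, and `SAMPLER_NAME` and `SCHEDULER` are declared the same way. A quick way to see the pairing, assuming the JSON above has been loaded into a dict named `info`:

```python
# Pair each declared output type with its name to spot enum-valued outputs.
# Assumes `info` holds the SDParameterGenerator node info shown above.
for out_name, out_type in zip(info['output_name'], info['output']):
    kind = 'enum (raw list of values)' if isinstance(out_type, list) else out_type
    print(f'{out_name}: {kind}')
# MODEL_NAME, SAMPLER_NAME and SCHEDULER come out as enums here; MODEL_NAME is
# the first one hit, matching the checkpoint list printed in the error above.
```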
Chaoses-Ib commented 10 months ago

Fixed in v0.3.0.