comfyanonymous / ComfyUI

The most powerful and modular diffusion model GUI, api and backend with a graph/nodes interface.
https://www.comfy.org/
GNU General Public License v3.0

Prompt validation fails with partially working prompt, leading to stuck main thread #4529

Open jkrauss82 opened 3 weeks ago

jkrauss82 commented 3 weeks ago

Expected Behavior

Prompt validation should fail and return a 400 to the client, as usual.

Actual Behavior

The stack trace below appears in the logs and the UI stops responding. The server keeps running and accepting requests, but nothing is executed anymore. See the debug logs for the stack trace.
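For illustration, a minimal self-contained sketch (not ComfyUI code) of this failure mode: an unhandled exception kills a queue worker thread, while the process keeps accepting jobs that are never processed.

import queue
import threading
import time

jobs = queue.Queue()

def worker():
    while True:
        job = jobs.get()
        # An unhandled exception here terminates only this thread;
        # the rest of the process keeps running and accepting jobs.
        raise RuntimeError(f"failed on {job}")

threading.Thread(target=worker, daemon=True).start()

jobs.put("job 1")                     # consumed, kills the worker thread
time.sleep(0.2)
jobs.put("job 2")                     # accepted, but nobody will ever process it
print("jobs waiting:", jobs.qsize())  # -> 1: the server looks alive but is stuck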

Steps to Reproduce

See the Python code below and the attached workflow for a minimal example that reproduces the error.

The script deletes the single checkpoint loader node ("4") from the JSON and then submits the prompt, which leads to the state described above.

import json
import sys
from urllib import request

workflow = "workflow_api_min_example.json"

# Load the attached workflow in API format.
prompt = None
with open(workflow) as f:
    prompt = json.load(f)
if prompt is None:
    print("Error, could not read workflow from file, abort.")
    sys.exit(1)

def queue_prompt(prompt):
    # POST the prompt to the ComfyUI /prompt endpoint.
    p = {"prompt": prompt}
    data = json.dumps(p).encode('utf-8')
    req = request.Request("http://127.0.0.1:7860/prompt", data=data)
    request.urlopen(req)

# Delete the checkpoint loader; nodes "8" and "11" still link to it.
del prompt["4"]

queue_prompt(prompt)
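With the expected behavior, this submission would fail client-side: urlopen raises urllib.error.HTTPError when the server answers 400. A sketch of checking for that (the error body format is an assumption):

from urllib import error

try:
    queue_prompt(prompt)
except error.HTTPError as e:
    # Expected path: the server refuses the invalid prompt with a 400.
    print("validation rejected prompt:", e.code, e.read().decode('utf-8'))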

workflow_api_min_example.json
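For context, "partially working" means the workflow has more than one output: validation rejects output 10 (its branch links to the deleted node "4", see the debug logs below), but validation apparently succeeds for at least one other output, so the prompt is queued anyway and the worker crashes while building cache keys. A rough sketch of the relevant fragment (class names for nodes "8" and "11" come from the validation output; everything else is an assumption):

# Hypothetical fragment of the prompt after `del prompt["4"]`:
prompt_fragment = {
    # "4" (the checkpoint loader) was deleted, but these nodes still link to it:
    "11": {"class_type": "LoraLoader",
           "inputs": {"model": ["4", 0], "clip": ["4", 1],
                      "lora_name": "example.safetensors",
                      "strength_model": 1.0, "strength_clip": 1.0}},
    "8":  {"class_type": "VAEDecode",
           "inputs": {"samples": ["3", 0],   # "3" is an assumed sampler id
                      "vae": ["4", 2]}},
    # "10" would be the output node (e.g. SaveImage) downstream of "8".
}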

Debug Logs

$ ./start.sh
Total VRAM 16081 MB, total RAM 31952 MB
pytorch version: 2.4.0+cu121
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4060 Ti : native
Using pytorch cross attention
[Prompt Server] web root: /home/user/workspace/ComfyUI/web
/home/user/workspace/ComfyUI/venv/lib/python3.10/site-packages/kornia/feature/lightglue.py:44: FutureWarning: `torch.cuda.amp.custom_fwd(args...)` is deprecated. Please use `torch.amp.custom_fwd(args..., device_type='cuda')` instead.
@torch.cuda.amp.custom_fwd(cast_inputs=torch.float32)
Skipping loading of custom nodes
Starting server

To see the GUI go to: http://0.0.0.0:7860
got prompt
Failed to validate prompt for output 10:
* LoraLoader 11:
- Exception when validating inner node: '4'
* VAEDecode 8:
- Exception when validating inner node: '4'
Output will be ignored
Exception in thread Thread-1 (prompt_worker):
Traceback (most recent call last):
  File "/usr/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/home/user/workspace/ComfyUI/main.py", line 121, in prompt_worker
    e.execute(item[2], prompt_id, item[3], item[4])
  File "/home/user/workspace/ComfyUI/execution.py", line 468, in execute
    cache.set_prompt(dynamic_prompt, prompt.keys(), is_changed_cache)
  File "/home/user/workspace/ComfyUI/comfy_execution/caching.py", line 136, in set_prompt
    self.cache_key_set = self.key_class(dynprompt, node_ids, is_changed_cache)
  File "/home/user/workspace/ComfyUI/comfy_execution/caching.py", line 68, in __init__
    self.add_keys(node_ids)
  File "/home/user/workspace/ComfyUI/comfy_execution/caching.py", line 78, in add_keys
    self.keys[node_id] = self.get_node_signature(self.dynprompt, node_id)
  File "/home/user/workspace/ComfyUI/comfy_execution/caching.py", line 83, in get_node_signature
    ancestors, order_mapping = self.get_ordered_ancestry(dynprompt, node_id)
  File "/home/user/workspace/ComfyUI/comfy_execution/caching.py", line 111, in get_ordered_ancestry
    self.get_ordered_ancestry_internal(dynprompt, node_id, ancestors, order_mapping)
  File "/home/user/workspace/ComfyUI/comfy_execution/caching.py", line 123, in get_ordered_ancestry_internal
    self.get_ordered_ancestry_internal(dynprompt, ancestor_id, ancestors, order_mapping)
  File "/home/user/workspace/ComfyUI/comfy_execution/caching.py", line 123, in get_ordered_ancestry_internal
    self.get_ordered_ancestry_internal(dynprompt, ancestor_id, ancestors, order_mapping)
  File "/home/user/workspace/ComfyUI/comfy_execution/caching.py", line 115, in get_ordered_ancestry_internal
    inputs = dynprompt.get_node(node_id)["inputs"]
  File "/home/user/workspace/ComfyUI/comfy_execution/graph.py", line 28, in get_node
    raise NodeNotFoundError(f"Node {node_id} not found")
comfy_execution.graph.NodeNotFoundError: Node 4 not found
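What the traceback shows: the cache-key builder walks each node's ancestry by following input links ([ancestor_id, output_slot] pairs) and looking every ancestor up in the prompt, so a dangling link to the deleted node "4" raises NodeNotFoundError long after validation has accepted the prompt. A simplified, self-contained sketch of that walk (the real code is in comfy_execution/caching.py and graph.py; details here are assumptions):

class NodeNotFoundError(KeyError):
    pass

def get_node(prompt, node_id):
    # Mirrors comfy_execution/graph.py: a missing id raises instead of returning None.
    if node_id not in prompt:
        raise NodeNotFoundError(f"Node {node_id} not found")
    return prompt[node_id]

def ordered_ancestry(prompt, node_id, ancestors, order_mapping):
    # Recursively collect ancestors in first-visit order.
    inputs = get_node(prompt, node_id)["inputs"]  # raises once the walk reaches a deleted node
    for value in inputs.values():
        if isinstance(value, list):               # a link: [ancestor_id, output_slot]
            ancestor_id = value[0]
            if ancestor_id not in order_mapping:
                ancestors.append(ancestor_id)
                order_mapping[ancestor_id] = len(ancestors) - 1
                ordered_ancestry(prompt, ancestor_id, ancestors, order_mapping)

Running this walk over any prompt containing a dangling link reproduces the bottom of the traceback.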

Other

No response

frankchieng commented 3 weeks ago

[Screenshot 2024-08-24 14-17-18] The API path through prompt_worker in main.py has this bug as well.
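A minimal, self-contained sketch of the kind of guard this implies (hypothetical, not an actual patch): catching exceptions around the execute call so one bad prompt cannot kill the worker thread.

import logging
import queue
import threading
import traceback

job_queue = queue.Queue()

def guarded_prompt_worker():
    while True:
        prompt_id, job = job_queue.get()
        try:
            job()  # stands in for e.execute(item[2], prompt_id, item[3], item[4])
        except Exception:
            # Hypothetical guard: log the error and keep the thread alive.
            logging.error("prompt %s failed:\n%s", prompt_id, traceback.format_exc())
        finally:
            job_queue.task_done()

def bad_prompt():
    raise RuntimeError("Node 4 not found")  # simulates the escaping NodeNotFoundError

def good_prompt():
    print("prompt 2 still executes")

threading.Thread(target=guarded_prompt_worker, daemon=True).start()
job_queue.put(("1", bad_prompt))
job_queue.put(("2", good_prompt))
job_queue.join()  # both jobs complete; the worker survives the exception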