city96 / ComfyUI_NetDist

Run ComfyUI workflows on multiple local GPUs/networked machines.
Apache License 2.0
236 stars 26 forks

Not json compliant bug. #1

Closed Ferniclestix closed 6 months ago

Ferniclestix commented 12 months ago

Keeps throwing "Out of range float values are not JSON compliant" errors.

!!! Exception during processing !!!
Traceback (most recent call last):
  File "C:\Users\penpen\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\models.py", line 511, in prepare_body
    body = complexjson.dumps(json, allow_nan=False)
  File "C:\Users\penpen\AppData\Local\Programs\Python\Python310\lib\json\__init__.py", line 238, in dumps
    **kw).encode(obj)
  File "C:\Users\penpen\AppData\Local\Programs\Python\Python310\lib\json\encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "C:\Users\penpen\AppData\Local\Programs\Python\Python310\lib\json\encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
ValueError: Out of range float values are not JSON compliant

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\othercomfy\origin\ComfyUI\execution.py", line 145, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "D:\othercomfy\origin\ComfyUI\execution.py", line 75, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "D:\othercomfy\origin\ComfyUI\execution.py", line 68, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "D:\othercomfy\origin\ComfyUI\custom_nodes\ComfyUI_NetDist\nodes\remote_control.py", line 177, in queue_on_remote
    ar = requests.post(remote_url+"prompt", json=data)
  File "C:\Users\penpen\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\api.py", line 115, in post
    return request("post", url, data=data, json=json, **kwargs)
  File "C:\Users\penpen\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Users\penpen\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 575, in request
    prep = self.prepare_request(req)
  File "C:\Users\penpen\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py", line 486, in prepare_request
    p.prepare(
  File "C:\Users\penpen\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\models.py", line 371, in prepare
    self.prepare_body(data, files, json)
  File "C:\Users\penpen\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\models.py", line 513, in prepare_body
    raise InvalidJSONError(ve, request=self)
requests.exceptions.InvalidJSONError: Out of range float values are not JSON compliant
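
For reference, the error comes from requests serializing the prompt with json.dumps(..., allow_nan=False), which rejects NaN/Infinity floats anywhere in the workflow data. A minimal sketch of a workaround (assuming data is a plain dict of JSON-style values; sanitize_floats is a hypothetical name, and the right replacement value depends on which widget produced the non-finite float):

```python
import math

def sanitize_floats(obj):
    # Recursively replace non-finite floats, which json.dumps(allow_nan=False) rejects.
    if isinstance(obj, float) and not math.isfinite(obj):
        return 0.0  # assumption: 0.0 is an acceptable stand-in for the offending value
    if isinstance(obj, dict):
        return {k: sanitize_floats(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [sanitize_floats(v) for v in obj]
    return obj

# data = sanitize_floats(data)
# ar = requests.post(remote_url + "prompt", json=data)
```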

Ferniclestix commented 12 months ago

On further testing, it appears to happen once and then keeps happening even if you revert your workflow.

Really, to be useful it needs the ability to send a JSON workflow to the other server to run, rather than sending back a copy of the current workflow, which makes what you can do with it extremely limited. If I could just make a workflow for it and have it send that to the different servers so they run different processes on demand, it would be so much more useful.
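
For context, the stock ComfyUI HTTP API already accepts an arbitrary API-format workflow via POST /prompt, so a rough sketch of what is being asked for might look like this (the helper name, example host, and client_id value are hypothetical, not part of NetDist):

```python
import json
import requests

def queue_saved_workflow(remote_url, workflow_path, client_id="netdist"):
    # Load a workflow exported with "Save (API Format)" and queue it on a remote server.
    with open(workflow_path, "r", encoding="utf-8") as f:
        prompt = json.load(f)
    payload = {"prompt": prompt, "client_id": client_id}
    r = requests.post(remote_url.rstrip("/") + "/prompt", json=payload)
    r.raise_for_status()
    return r.json()  # includes the prompt_id assigned by the remote server

# queue_saved_workflow("http://192.168.1.25:8188", "my_workflow_api.json")
```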

city96 commented 12 months ago

> Really, to be useful it needs the ability to send a JSON workflow to the other server to run

This would make a lot of sense since it would make it possible to have the network distribution workflow completely separate. I think the biggest challenge would be seeds/batch sizes, especially if a workflow has more than one.

I think my initial approach was over-complicating it a bit; I'll see if I can rewrite this when I get the time.
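
On the seed/batch question, one option (a minimal sketch, assuming the API-format export where each node is keyed by id and has an inputs dict; offset_seeds is a hypothetical name) is to patch the seeds of the loaded workflow per client before dispatching it:

```python
def offset_seeds(prompt, offset):
    # Shift every integer "seed" input so each remote client renders a
    # different batch from the same saved workflow.
    for node in prompt.values():
        inputs = node.get("inputs", {})
        if isinstance(inputs.get("seed"), int):
            inputs["seed"] += offset
    return prompt

# prompt = offset_seeds(json.load(open("my_workflow_api.json")), offset=client_index)
```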

Ferniclestix commented 12 months ago

I'm thinking of something like a node in the main workflow that accepts a bunch of inputs and a JSON load path, stuff like prompts, seed, images, etc. Then you send those and it loads your second workflow, which starts with an output version of that node that sends the seed, prompt and all that wherever it needs to go.

Not sure how I'd handle image retrieval; having the remote workflows wait on each other for completion would basically destroy a lot of the usefulness of this, because you lose the parallel processing ability then... a conundrum.
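
One way around that (a sketch only, not how NetDist currently handles it; collect_outputs and the jobs structure are hypothetical) is to queue on every remote first and only afterwards poll each server's /history/{prompt_id} endpoint, so the machines still render in parallel:

```python
import time
import requests

def collect_outputs(jobs, poll_interval=2.0):
    # jobs: list of (remote_url, prompt_id) pairs returned when queueing each workflow.
    results = {}
    pending = list(jobs)
    while pending:
        still_pending = []
        for url, prompt_id in pending:
            hist = requests.get(f"{url.rstrip('/')}/history/{prompt_id}").json()
            if prompt_id in hist:
                # Finished: the history entry lists the output images per node.
                results[(url, prompt_id)] = hist[prompt_id]["outputs"]
            else:
                still_pending.append((url, prompt_id))
        pending = still_pending
        if pending:
            time.sleep(poll_interval)
    return results
```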

city96 commented 6 months ago

Guess I'll close this for now. I rewrote a bunch of stuff so maybe it won't fail anymore, though I couldn't reproduce this one originally either IIRC.

You can also follow #7 for using saved workflows. It works now but needs more features to be fully useful, i.e. changing the workflow per client.