lks-ai / anynode

A Node for ComfyUI that does what you ask it to do

Error occurred when executing AnyNodeLocal: unterminated string literal (detected at line 1) (<string>, line 1) #5

Closed: AugmentedRealityCat closed this issue 1 month ago

AugmentedRealityCat commented 1 month ago

Anything I try to do with the Local version of Anynode now gives me the error message in the title. In the terminal window, the last line is: SyntaxError: unterminated string literal (detected at line 1)

In Comfy's web UI, I get this error window:

[Screenshot: ComfyUI error dialog, 2024-05-29 04:59]

It looks like the LLM actually manages to create some valid-looking code at first glance, but the node then fails to execute it. Execution fails right after the line "Here's a Python function that takes in a tensor representing an image and outputs the same image but with inverted brightness:"

In the test below I was trying to invert an image. The AnyNode prompt was "I want you to output the image with inverted brightness." I was using Llama3 (Meta-Llama-3-8B-Instruct).

I tried to run it at least 10 times (more likely over 20!) just to make sure the code auto-correct would not come to save me. It did not.

NOTE: since the error log contains the same backtick symbols this page uses to mark up code blocks, the log I am pasting here got split into two parts. It should really be read as one single error message, with both parts together.

Starting server
To see the GUI go to: http://127.0.0.1:8188
To see the GUI go to: https://127.0.0.1:8189
FETCH DATA from: D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json [DONE]
Error: OpenAI API key is invalid OpenAI features wont work for you
QualityOfLifeSuit_Omar92::NSP ready
#read_workflow_json_files_all D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-mixlab-nodes\app\
got prompt
[rgthree] Using rgthree's optimized recursive execution.
[rgthree] First run patching recursive_output_delete_if_changed and recursive_will_execute.
[rgthree] Note: If execution seems broken due to forward ComfyUI changes, you can disable the optimization from rgthree settings in ComfyUI.
Last Error: None
Generating Node function...
INPUT tensor([[[[0.4902, 0.5294, 0.4863],
          [0.4941, 0.5333, 0.4902],
          [0.5020, 0.5412, 0.4980],
          ...,
          [0.3098, 0.2863, 0.2392],
          [0.3098, 0.2863, 0.2392],
          [0.3098, 0.2863, 0.2392]],

         [[0.4902, 0.5294, 0.4863],
          [0.4941, 0.5333, 0.4902],
          [0.5020, 0.5412, 0.4980],
          ...,
          [0.3098, 0.2863, 0.2392],
          [0.3098, 0.2863, 0.2392],
          [0.3098, 0.2863, 0.2392]],

         [[0.4902, 0.5294, 0.4863],
          [0.4941, 0.5333, 0.4902],
          [0.5020, 0.5412, 0.4980],
          ...,
          [0.3098, 0.2863, 0.2392],
          [0.3098, 0.2863, 0.2392],
          [0.3098, 0.2863, 0.2392]],

         ...,

         [[0.4745, 0.5255, 0.4549],
          [0.4863, 0.5373, 0.4667],
          [0.5020, 0.5529, 0.4824],
          ...,
          [0.4902, 0.3843, 0.3412],
          [0.5020, 0.3961, 0.3529],
          [0.4902, 0.3843, 0.3412]],

         [[0.4745, 0.5255, 0.4549],
          [0.4863, 0.5373, 0.4667],
          [0.5020, 0.5529, 0.4824],
          ...,
          [0.4078, 0.3020, 0.2588],
          [0.3765, 0.2706, 0.2275],
          [0.4039, 0.2980, 0.2549]],

         [[0.4745, 0.5255, 0.4549],
          [0.4863, 0.5373, 0.4667],
          [0.5020, 0.5529, 0.4824],
          ...,
          [0.4392, 0.3333, 0.2902],
          [0.3686, 0.2627, 0.2196],
          [0.3882, 0.2824, 0.2392]]]])
LE: None
{'id': 'chatcmpl-688', 'object': 'chat.completion', 'created': 1716971910, 'model': 'llama3', 'system_fingerprint': 'fp_ollama', 'choices': [{'index': 0, 'message': {'role': 'assistant', 'content': "Here's a Python function that takes in a tensor representing an image and outputs the same image but with inverted brightness:\n\n```python\nimport torch\n\ndef generated_function(input_data):\n    def invert_brightness(image_tensor):\n        return 1 - image_tensor\n\n    return invert_brightness(input_data)\n```\n\nThis code defines a function `invert_brightness` that subtracts each pixel value from 1, effectively inverting the brightness of the image. The main function `generated_function` applies this transformation to the input tensor and returns the result."}, 'finish_reason': 'stop'}], 'usage': {'prompt_tokens': 470, 'completion_tokens': 107, 'total_tokens': 577}}
Imports in code: ['import torch']
Stored script:
Here's a Python function that takes in a tensor representing an image and outputs the same image but with inverted brightness:

def generated_function(input_data):
    def invert_brightness(image_tensor):
        return 1 - image_tensor

    return invert_brightness(input_data)
This code defines a function `invert_brightness` that subtracts each pixel value from 1, effectively inverting the brightness of the image. The main function `generated_function` applies this transformation to the input tensor and returns the result.
An error occurred:
Traceback (most recent call last):
  File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\anynode\nodes\any.py", line 221, in safe_exec
    exec(code_string, globals_dict, locals_dict)
  File "<string>", line 1
    Here's a Python function that takes in a tensor representing an image and outputs the same image but with inverted brightness:
        ^
SyntaxError: unterminated string literal (detected at line 1)
!!! Exception during processing!!! unterminated string literal (detected at line 1) (<string>, line 1)
Traceback (most recent call last):
  File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\anynode\nodes\any.py", line 260, in go
    raise e
  File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\anynode\nodes\any.py", line 252, in go
    self.safe_exec(self.script, globals_dict, locals_dict)
  File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\anynode\nodes\any.py", line 225, in safe_exec
    raise e
  File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\anynode\nodes\any.py", line 221, in safe_exec
    exec(code_string, globals_dict, locals_dict)
  File "<string>", line 1
    Here's a Python function that takes in a tensor representing an image and outputs the same image but with inverted brightness:
        ^
SyntaxError: unterminated string literal (detected at line 1)

Prompt executed in 6.67 seconds
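
From the stored script above, it looks like the model's explanatory prose got kept together with the code, so exec() trips on the very first non-Python line. For what it's worth, here is a minimal sketch of the kind of fence-stripping that would sidestep this, assuming the raw reply text is available as a string (`extract_python_code` and `response_content` are hypothetical names, not AnyNode's actual API; `globals_dict`/`locals_dict` are the ones from the traceback):

```python
import re

def extract_python_code(llm_reply: str) -> str:
    # Keep only the body of the first fenced ```python ... ``` block and
    # drop the surrounding prose; fall back to the raw reply if no fence
    # is present. (Hypothetical helper, not AnyNode's actual code.)
    match = re.search(r"```(?:python)?\s*\n(.*?)```", llm_reply, re.DOTALL)
    return match.group(1).strip() if match else llm_reply.strip()

# e.g. code_string = extract_python_code(response_content)
#      exec(code_string, globals_dict, locals_dict)
```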
AugmentedRealityCat commented 1 month ago

If I try to run a simple workflow where I multiply an integer by 5 (as per the default anynode prompt), everything works well, both with Llama3 and Mistral. No error message whatsoever. Here is the log in case it has something interesting for you:

Generating Node function...
INPUT 5
LE: None
{'id': 'chatcmpl-975', 'object': 'chat.completion', 'created': 1716973829, 'model': 'mistral', 'system_fingerprint': 'fp_ollama', 'choices': [{'index': 0, 'message': {'role': 'assistant', 'content': ' ```python\nimport math\ndef generated_function(input_data):\n    def multiply_by_five(x):\n        return x * 5\n    return multiply_by_five(input_data)\n```'}, 'finish_reason': 'stop'}], 'usage': {'prompt_tokens': 504, 'completion_tokens': 52, 'total_tokens': 556}}
Imports in code: ['import math']
Stored script:
def generated_function(input_data):
    def multiply_by_five(x):
        return x * 5
    return multiply_by_five(input_data)
Function result: 25
Prompt executed in 4.62 seconds
got prompt
[rgthree] Using rgthree's optimized recursive execution.
Last Error: None
Function result: 25
Prompt executed in 0.00 seconds

I'll run more tests and see if I can get it to affect something other than a numeric value.

If you want me to test anything specifically, just let me know.

lks-ai commented 1 month ago

Fixed in latest update.