stavsap / comfyui-ollama

Apache License 2.0

Getting a 'load_duration' and sometimes a 'context' error. #23

Closed Appolonius001 closed 1 month ago

Appolonius001 commented 2 months ago
Error occurred when executing OllamaGenerateAdvance:

'context'

File "C:\V\ComfyUI\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\V\ComfyUI\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\V\ComfyUI\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\V\ComfyUI\ComfyUI\custom_nodes\comfyui-ollama\CompfyuiOllama.py", line 255, in ollama_generate_advance
return (response['response'],response['context'],)
~~~~~~~~^^^^^^^^^^^

and

Error occurred when executing OllamaGenerate:

'load_duration'

File "C:\V\ComfyUI\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\V\ComfyUI\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\V\ComfyUI\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\V\ComfyUI\ComfyUI\custom_nodes\comfyui-ollama\CompfyuiOllama.py", line 129, in ollama_generate
- load_duration: {response["load_duration"]}
~~~~~~~~^^^^^^^^^^^^^^^^^

The Vision node seems to work, but I get the above errors with Generate Advance and Generate.
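
For what it's worth, both failures are plain KeyErrors: the node indexes response keys ('context', 'load_duration') that the Ollama response no longer includes. A minimal defensive sketch (an illustration, not the actual node code; `safe_outputs` is a hypothetical helper over the dict the Ollama client returns) that would avoid the crash:

```python
# Hypothetical guard: read optional Ollama response keys with dict.get()
# so a missing 'context' or 'load_duration' cannot raise KeyError.
def safe_outputs(response: dict):
    text = response.get("response", "")
    context = response.get("context", [])          # empty context if absent
    load_duration = response.get("load_duration")  # None if absent
    if load_duration is not None:
        print(f"- load_duration: {load_duration}")
    return (text, context)
```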

alexnikospb commented 2 months ago
+1. Today, after updating Ollama and ComfyUI, I started getting this error:

Error occurred when executing OllamaGenerate: 'load_duration'

stavsap commented 2 months ago

Pushed a fix for the print; it seems the response from Ollama has changed. API change?

It also seems that the context output in Advance is missing; checking this.
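
(Roughly, the fix only needs to guard the debug print. A sketch along these lines, assuming the node's debug flag is the string "enable"; this is an illustration, not the exact commit:)

```python
# Illustration only: print optional response fields without risking KeyError.
def print_debug(response: dict, debug: str = "enable") -> None:
    if debug != "enable":
        return
    for key in ("total_duration", "load_duration", "eval_duration", "context"):
        print(f"- {key}: {response.get(key, '<missing>')}")
```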

alexnikospb commented 2 months ago

The API has not changed; it connects to Ollama, but then an error appears immediately. Yesterday everything worked fine with the same workflow and settings. From the console:

HTTP Request: POST http://127.0.0.1:11434/api/generate "HTTP/1.1 200 OK"
!!! Exception during processing!!! 'load_duration'
Traceback (most recent call last):
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-ollama\CompfyuiOllama.py", line 129, in ollama_generate
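
One way to see which fields the server actually returns, independent of ComfyUI, is to hit the endpoint directly. A stdlib-only sketch, assuming a local server and an installed model (the name "llama3" below is a placeholder):

```python
# Inspect the raw /api/generate response to see which keys the server returns.
import json
import urllib.request

payload = {"model": "llama3", "prompt": "hi", "stream": False}  # placeholder model name
req = urllib.request.Request(
    "http://127.0.0.1:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())

print(sorted(body.keys()))  # check whether 'context' and 'load_duration' appear
```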

KorDum commented 2 months ago

I can confirm that the last commit does not fix the issue.

alexnikospb commented 2 months ago

I solved the problem by completely reinstalling.

KorDum commented 2 months ago

It didn't help me. I also tried ComfyUI Ollama YN; I get the same error there.

alexnikospb commented 2 months ago

> It didn't help me. I also tried ComfyUI Ollama YN; I get the same error there.

Try changing debug to disable in Ollama Generate.

KorDum commented 2 months ago

I also uninstalled all Python dependencies and reinstalled them without the cache. The error persisted; the debug flag has no effect on it :(

KorDum commented 2 months ago

Oh! I got it! The problem is in the Ollama Generate Advance node; there is no error in the Ollama Generate node!

stavsap commented 2 months ago

Please update Ollama to 0.2.4; it seems their API is back to normal.
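
(To confirm which server version is actually running, you can query Ollama's version endpoint; a small sketch assuming the default local address:)

```python
# Ask the running Ollama server for its version (GET /api/version).
import json
import urllib.request

with urllib.request.urlopen("http://127.0.0.1:11434/api/version") as resp:
    print(json.loads(resp.read()))  # e.g. {"version": "0.2.4"}
```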

KorDum commented 2 months ago

The problem was observed on 0.2.4 itself; 0.2.5 fixes it. Thanks!