griptape-ai / ComfyUI-Griptape

A suite of Griptape nodes for ComfyUI
Apache License 2.0

Fail to access local Ollama #43

Closed by aicoder2048 12 hours ago

aicoder2048 commented 4 days ago

I am using the "Griptape Create: Agent" node. It works with the default config (OpenAI), but fails when connected to the "Griptape Agent Config: Ollama" node.

I am on Windows 11 and have installed Ollama with LLM models. Ollama works fine on its own, and I have made sure it is running by manually starting it in a terminal with `ollama serve`.
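A quick way to confirm the server is actually reachable, as a sketch using `httpx` (already installed with ComfyUI); Ollama's `/api/tags` endpoint lists the locally installed models:

```python
# Sanity-check sketch (my own, not part of the node): a 200 response from
# /api/tags confirms the Ollama server is reachable and lists local models.
import httpx

resp = httpx.get("http://127.0.0.1:11434/api/tags")
print(resp.status_code)                            # expect 200
print([m["name"] for m in resp.json()["models"]])  # installed models
```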

It looks like the Griptape Ollama node cannot connect to the local Ollama server (http://0.0.0.0:11434) for some reason.


The error message shown in the ComfyUI terminal is:

`<RetryCallState 2453721244368: attempt #1; slept for 0.0; last result: failed (ConnectError [WinError 10049] invalid address within its context)>`

```
Traceback (most recent call last):
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\griptape\tasks\base_task.py", line 135, in execute
    self.output = self.run()
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\griptape\tasks\prompt_task.py", line 63, in run
    self.output = self.prompt_driver.run(self.prompt_stack)
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\griptape\drivers\prompt\base_prompt_driver.py", line 62, in run
    for attempt in self.retrying():
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\tenacity\__init__.py", line 347, in __iter__
    do = self.iter(retry_state=retry_state)
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\tenacity\__init__.py", line 325, in iter
    raise retry_exc.reraise()
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\tenacity\__init__.py", line 158, in reraise
    raise self.last_attempt.result()
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\concurrent\futures\_base.py", line 449, in result
    return self.__get_result()
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\griptape\drivers\prompt\base_prompt_driver.py", line 74, in run
    result = self.try_run(prompt_stack)
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\griptape\drivers\prompt\ollama_prompt_driver.py", line 50, in try_run
    response = self.client.chat(**self._base_params(prompt_stack))
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\ollama\_client.py", line 180, in chat
    return self._request_stream(
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\ollama\_client.py", line 98, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\ollama\_client.py", line 69, in _request
    response = self._client.request(method, url, **kwargs)
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\httpx\_client.py", line 827, in request
    return self.send(request, auth=auth, follow_redirects=follow_redirects)
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\httpx\_client.py", line 914, in send
    response = self._send_handling_auth(
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\httpx\_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\httpx\_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\httpx\_client.py", line 1015, in _send_single_request
    response = transport.handle_request(request)
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\httpx\_transports\default.py", line 232, in handle_request
    with map_httpcore_exceptions():
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\Users\sean2092\miniconda3\envs\ComfyUI\Lib\site-packages\httpx\_transports\default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
```


Is there a way to turn on the relevant DEBUG info in the Ollama config node?

thanks

shhlife commented 3 days ago

Thanks for the comment! It's probably failing because it's checking a different URL for the models. In the next update I'll expose that as a parameter so you can set the URL and port you require :)
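For context, the change will roughly amount to passing the configured address through to griptape's `OllamaPromptDriver`, which forwards it to the underlying `ollama.Client`. A sketch of the idea (not the node's exact code; the model name is just an example):

```python
# Rough sketch of a configurable URL/port, not the node's actual code:
# OllamaPromptDriver forwards `host` to ollama.Client, so the node can
# expose the URL and port as user-editable settings.
from griptape.drivers import OllamaPromptDriver

driver = OllamaPromptDriver(
    model="llama3",                 # example model name
    host="http://127.0.0.1:11434",  # user-configured URL and port
)
```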

shhlife commented 1 day ago

I've just pushed a new update that allows for custom URL & port configuration - can you give it a try?

(screenshot: the updated "Griptape Agent Config: Ollama" node with URL and port fields)

aicoder2048 commented 12 hours ago

Hi shhlife, I just tried your new update to the "Griptape Agent Config: Ollama" node, and it works very well now. Thanks!

The issue I reported can be closed with this new update!

Thanks again, Sean

shhlife commented 12 hours ago

Fantastic - really glad to hear it! :) Thanks!