if-ai / ComfyUI-IF_AI_tools

ComfyUI-IF_AI_tools is a set of custom nodes for ComfyUI that allows you to generate prompts using a local Large Language Model (LLM) via Ollama. This tool enables you to enhance your image generation workflow by leveraging the power of language models.

IMPORT FAILS COMFYUI #27

Closed aifuzz59 closed 1 month ago

aifuzz59 commented 1 month ago

Import fails with this error:

File "D:\ComfyUI_Training\ComfyUI_windows_portable_nvidia_cu121_or_cpu (4)\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-IF_AI_tools\IFPromptMkrNode.py", line 5, in <module>
  import anthropic
ModuleNotFoundError: No module named 'anthropic'

Cannot import D:\ComfyUI_Training\ComfyUI_windows_portable_nvidia_cu121_or_cpu (4)\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-IF_AI_tools module for custom nodes: No module named 'anthropic'

if-ai commented 1 month ago

Run pip install anthropic in the activated ComfyUI environment, or install the missing 'anthropic' pip package via the ComfyUI Manager.
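A node pack can also avoid failing ComfyUI's import entirely when an optional backend package is missing. This is a hypothetical sketch, not the repo's actual code: the guard pattern below (and the helper name get_anthropic_client) is an assumption, showing how the import could be deferred so the error surfaces only when the Anthropic backend is actually used.

```python
# Sketch: guard the optional dependency so the node pack still imports
# when `anthropic` is missing, instead of aborting ComfyUI's node load.
try:
    import anthropic  # optional; only needed for the Anthropic backend
except ImportError:
    anthropic = None  # defer the failure until the backend is requested


def get_anthropic_client(api_key: str):
    """Return an Anthropic client, or raise a clear, actionable error."""
    if anthropic is None:
        raise RuntimeError(
            "The 'anthropic' package is not installed. Run "
            "`pip install anthropic` in the ComfyUI Python environment."
        )
    return anthropic.Anthropic(api_key=api_key)
```

With this in place, users who never select the Anthropic backend are unaffected, and those who do get an error message naming the exact fix.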

BO1103 commented 1 month ago

Error occurred when executing OllamaVision:

[WinError 10061] No connection could be made because the target machine actively refused it.

File "E:\comfyui\ComfyUI\execution.py", line 151, in recursive_execute
  output_data, output_ui = get_output_data(obj, input_data_all)
File "E:\comfyui\ComfyUI\execution.py", line 81, in get_output_data
  return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "E:\comfyui\ComfyUI\execution.py", line 74, in map_node_over_list
  results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "E:\comfyui\ComfyUI\custom_nodes\comfyui-ollama\CompfyuiOllama.py", line 64, in ollama_vision
  response = client.generate(model=model, prompt=query, images=images_b64)
File "E:\comfyui\ComfyUI\python\lib\site-packages\ollama\_client.py", line 126, in generate
  return self._request_stream(
File "E:\comfyui\ComfyUI\python\lib\site-packages\ollama\_client.py", line 97, in _request_stream
  return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
File "E:\comfyui\ComfyUI\python\lib\site-packages\ollama\_client.py", line 68, in _request
  response = self._client.request(method, url, **kwargs)
File "E:\comfyui\ComfyUI\python\lib\site-packages\httpx\_client.py", line 827, in request
  return self.send(request, auth=auth, follow_redirects=follow_redirects)
File "", line 84, in send
File "E:\comfyui\ComfyUI\python\lib\site-packages\httpx\_client.py", line 914, in send
  response = self._send_handling_auth(
File "E:\comfyui\ComfyUI\python\lib\site-packages\httpx\_client.py", line 942, in _send_handling_auth
  response = self._send_handling_redirects(
File "E:\comfyui\ComfyUI\python\lib\site-packages\httpx\_client.py", line 979, in _send_handling_redirects
  response = self._send_single_request(request)
File "E:\comfyui\ComfyUI\python\lib\site-packages\httpx\_client.py", line 1015, in _send_single_request
  response = transport.handle_request(request)
File "E:\comfyui\ComfyUI\python\lib\site-packages\httpx\_transports\default.py", line 232, in handle_request
  with map_httpcore_exceptions():
File "E:\comfyui\ComfyUI\python\lib\contextlib.py", line 153, in __exit__
  self.gen.throw(typ, value, traceback)
File "E:\comfyui\ComfyUI\python\lib\site-packages\httpx\_transports\default.py", line 86, in map_httpcore_exceptions
  raise mapped_exc(message) from exc
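WinError 10061 (connection actively refused) generally means nothing is listening at the address the client is calling, which for Ollama is typically http://127.0.0.1:11434 unless configured otherwise. A minimal reachability check, assuming that default address (the function name ollama_is_up is ours, not part of either node pack):

```python
import urllib.request
import urllib.error

# Ollama's default listen address; adjust if OLLAMA_HOST is set differently.
OLLAMA_URL = "http://127.0.0.1:11434"


def ollama_is_up(url: str = OLLAMA_URL) -> bool:
    """Return True if an HTTP server answers at `url` within 3 seconds."""
    try:
        with urllib.request.urlopen(url, timeout=3) as resp:
            # A running Ollama server answers GET / with 200 "Ollama is running".
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused / timed out: no server at that address.
        return False


if __name__ == "__main__":
    if ollama_is_up():
        print("Ollama server reachable")
    else:
        print("Connection refused: start the server with `ollama serve`")
```

If this reports the server as unreachable, starting Ollama (for example with `ollama serve`) before running the workflow should clear the error.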

if-ai commented 1 month ago

That's not my node: E:\comfyui\ComfyUI\custom_nodes\comfyui-ollama\CompfyuiOllama.py

You should take this to that repo. My node talks to Ollama directly and does not use a client the way that repo does.