stavsap / comfyui-ollama


I got this issue, how do I deal with it? #33

Closed zmczmc123654 closed 2 months ago

zmczmc123654 commented 3 months ago

```
Error fetching Ollama models: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000002968AE3C490>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it.'))
Error fetching Ollama models: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000002968AE3CF10>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it.'))
[comfy_mtb] | INFO -> Found multiple match, we will pick the last D:/SDAI/sd-webui-aki-v4.8\models/SwinIR ['D:\SDAI\ComfyUI-aki-v1.3\models\upscale_models', 'D:/SDAI/sd-webui-aki-v4.8\models/ESRGAN', 'D:/SDAI/sd-webui-aki-v4.8\models/RealESRGAN', 'D:/SDAI/sd-webui-aki-v4.8\models/SwinIR']
D:\SDAI\ComfyUI-aki-v1.3\custom_nodes\comfyui-mixlab-nodes\webApp\lib/juxtapose.css
D:\SDAI\ComfyUI-aki-v1.3\custom_nodes\comfyui-mixlab-nodes\webApp\lib/juxtapose.min.js
Error fetching Ollama models: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000002968A8522C0>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it.'))
Error fetching Ollama models: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000002968A68BF10>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it.'))
```

JimWang151 commented 2 months ago

Is your llava running?

stavsap commented 2 months ago

Is your ollama server running?

If you go in the browser to http://localhost:11434, do you get a response?
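
If the browser check is inconvenient, the same thing can be done from Python. A minimal sketch (it assumes the default Ollama address http://localhost:11434 and queries /api/tags, the same endpoint the node is failing on):

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # default Ollama host/port

try:
    # /api/tags lists the locally installed models; this is the request the node makes
    resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print("Ollama is reachable. Models:", models)
except requests.exceptions.ConnectionError as err:
    print(f"Ollama is not reachable at {OLLAMA_URL}: {err}")
```

If this raises the same WinError 10061, nothing is listening on that port (or something is blocking the connection).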

JimWang151 commented 2 months ago

> Is your ollama server running?
>
> If you go in the browser to http://localhost:11434, do you get a response?

Yes, it's running.

JimWang151 commented 2 months ago

I have now solved this issue. There is no problem with the code. Shut down your proxy, or add a rule so requests to localhost/127.0.0.1 bypass it.
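
For anyone else landing here: one way to do the "add a rule" part is to exempt local addresses from the proxy via the NO_PROXY environment variable, which requests/urllib3 honor. A minimal sketch (it assumes the proxy is picked up from HTTP_PROXY/HTTPS_PROXY environment variables; GUI proxy tools have their own bypass lists):

```python
import os
import requests

# Exempt local addresses from any proxy set via HTTP_PROXY / HTTPS_PROXY,
# so calls to the Ollama server go direct instead of being refused by the proxy.
os.environ["NO_PROXY"] = "localhost,127.0.0.1"

resp = requests.get("http://localhost:11434/api/tags", timeout=5)
print(resp.status_code)  # 200 means the request reached Ollama directly
```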

This issue can be closed.