xXAdonesXx / NodeGPT

ComfyUI Extension Nodes for Automated Text Generation.
GNU Affero General Public License v3.0

Cannot connect Local LLM to Ollama Node #23

Status: Open · mustafakucuk0 opened this issue 12 months ago

mustafakucuk0 commented 12 months ago

Hello,

Firstly, thank you for this repo. When I try to connect the Ollama node to the Mistral 7B model served locally via `ollama serve`, I keep getting the error below. Is LM Studio the only way to use NodeGPT in ComfyUI? If so, how can I serve models on an Ubuntu server that has no GUI?
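Before debugging the node itself, it may help to confirm the Ollama server is reachable from the ComfyUI host. Below is a minimal sketch (not part of NodeGPT) that talks to Ollama's HTTP API, assuming the default endpoint `http://localhost:11434` and a pulled `mistral` model:

```python
import json
import urllib.request

# Assumption: `ollama serve` is running on the same host with the default port.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the Ollama HTTP API."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    """Send the request and return the generated text from the response body."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running server):
# print(generate("mistral", "Say hello in one word."))
```

If this fails with a connection error, the problem is the server or the host/port the node points at, not NodeGPT; on a headless Ubuntu server, `ollama serve` runs fine without a GUI, which is one answer to the second question above.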

[screenshot of the error attached]

shiloh92 commented 11 months ago

Not a good sign that there's been no response on this one.