Hello,
Firstly, thank you for this repo. When I try to connect the Ollama node to the Mistral 7B model that is served locally via `ollama serve`, I keep getting this error. Is LM Studio the only way to use NodeGPT in ComfyUI? If so, how can I serve the models on a headless Ubuntu Server (no GUI)?
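
For context, this is roughly how I check that the Ollama API itself is reachable before involving the node (a minimal sketch; the port 11434 is Ollama's default, and `mistral` is just the tag I pulled, so adjust both if your setup differs):

```python
# Minimal check that a locally running `ollama serve` answers requests.
# Assumes the default port 11434 and that the model was pulled as "mistral".
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "mistral", "prompt": "Hello", "stream": False},
    timeout=60,
)
print(resp.status_code)           # expect 200 if the server and model are up
print(resp.json().get("response"))  # the model's reply text
```

This responds fine on my machine, so the server itself appears to be running; the failure only shows up when the Ollama node in ComfyUI tries to connect.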