Closed Silversith closed 2 months ago
Hi, I am working on a fix for this. Sorry about the trouble; I will try to add it as soon as I can. Thank you for letting me know.
I think I fixed this in the last update. Could you please check and close the issue if it works? Thank you.
Hello, I have no models showing on my node, despite having installed Ollama, the prompt generator, and IF AI tools through the node manager. I also installed LLaVA and Nous Hermes. What's the problem? Why does my "IF prompt to prompt" node have no value in the "base ip" parameter?
I answered on your other thread
I noticed that the node doesn't load the Ollama models when Ollama runs on a different server than the one serving the ComfyUI page. I'm guessing the model list is queried from localhost regardless of the "base ip" setting, because when I force the model names with a "String List to Combo" node, generation works perfectly against the IP address I put into the node.
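For reference, a minimal sketch of what I'd expect the dropdown population to do: build the Ollama `/api/tags` URL from the configured base IP instead of a hardcoded localhost, then read the model names out of the response. The function names here are just illustrative, not the actual code in this repo:

```python
import json
import urllib.request


def build_tags_url(base_ip: str = "127.0.0.1", port: int = 11434) -> str:
    """Build the Ollama model-list endpoint from the node's base_ip setting,
    rather than assuming localhost."""
    return f"http://{base_ip}:{port}/api/tags"


def parse_model_names(tags_json: dict) -> list[str]:
    """Extract model names from an /api/tags response like
    {"models": [{"name": "llama3:latest", ...}, ...]}."""
    return [m["name"] for m in tags_json.get("models", [])]


def get_ollama_models(base_ip: str = "127.0.0.1", port: int = 11434) -> list[str]:
    """Query the configured Ollama server for its installed models."""
    with urllib.request.urlopen(build_tags_url(base_ip, port), timeout=5) as resp:
        return parse_model_names(json.load(resp))
```

So with `base_ip = "192.168.1.50"` the node would hit `http://192.168.1.50:11434/api/tags` instead of localhost, and the combo box would fill with whatever that remote server reports.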