Closed: CoUIfy closed this 2 weeks ago
Right, you need to set `base_ip` to the IP address of the computer you're running Ollama on, which is normally 127.0.0.1,
and set `port` to 11434.
Then restart ComfyUI. Before restarting, open a terminal or command prompt (CMD) and type `ollama list`;
this should show the models you have installed. Make sure you are not blocking 127.0.0.1 with your firewall or VPN, then start ComfyUI.
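The steps above can be sketched as a quick connectivity check. This is a minimal sketch (the function names are mine, not part of the node): it builds the URL the node derives from its `base_ip`/`port` fields and queries Ollama's `/api/tags` REST endpoint, which returns the same models that `ollama list` prints. If it returns `None`, the server isn't running or something (firewall/VPN) is blocking 127.0.0.1.

```python
import json
import urllib.request

def ollama_url(base_ip: str = "127.0.0.1", port: int = 11434) -> str:
    """Build the base URL from the node's base_ip and port fields."""
    return f"http://{base_ip}:{port}"

def list_models(base_ip: str = "127.0.0.1", port: int = 11434):
    """Return installed model names, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{ollama_url(base_ip, port)}/api/tags", timeout=5) as r:
            data = json.load(r)
        return [m["name"] for m in data.get("models", [])]
    except OSError:
        # Covers connection refused, timeouts, and firewall/VPN blocks.
        return None

if __name__ == "__main__":
    print("Ollama models:", list_models())
```

Run it before starting ComfyUI: a list (even an empty one) means the server is reachable; `None` means fix the server or firewall first.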
It freaking worked! After entering the values you gave me, I had to run it once before restarting, otherwise the values would not be saved. So the second time I had to retype them, run it, and it just worked! :) Thank you!
Since I have you here @if-ai, I saw that you recommended other models (in the repo). Can I install them the same way I do the officially supported models on Ollama, like "ollama run x"?
I mean, how do I install Proteus-RunDiffusion? I did this:

```
ollama run Proteus-RunDiffusion
pulling manifest
Error: pull model manifest: file does not exist
```
Sorry for the noob question
Proteus is a Stable Diffusion model with better prompt adherence. You can also use Cascade and PixArt Sigma models; those are all Stable Diffusion. There are a few models I recommend for text that are not available on Ollama, but I didn't mention them in the readme. https://huggingface.co/dataautogpt3/ProteusV0.4
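To make the distinction concrete, here is a small sketch (the mapping and helper are illustrative, not part of any tool): text/vision models like LLaVA come from the Ollama registry via `ollama pull`, while Stable Diffusion checkpoints like ProteusV0.4 are downloaded from Hugging Face into ComfyUI's checkpoints folder; that is why `ollama run Proteus-RunDiffusion` fails with a missing manifest.

```python
# Assumption: which backend each recommended model belongs to, per the thread above.
RECOMMENDED = {
    "llava": "ollama",         # vision-language model, on the Ollama registry
    "nous-hermes": "ollama",   # text model, on the Ollama registry
    "ProteusV0.4": "comfyui",  # SD checkpoint: huggingface.co/dataautogpt3/ProteusV0.4
    "PixArt Sigma": "comfyui", # SD checkpoint, loaded by ComfyUI, not Ollama
}

def install_hint(model: str) -> str:
    """Return the right install step for a recommended model."""
    backend = RECOMMENDED.get(model)
    if backend == "ollama":
        return f"ollama pull {model}"
    if backend == "comfyui":
        return f"download {model} from Hugging Face into ComfyUI/models/checkpoints"
    return f"unknown model: {model}"

print(install_hint("ProteusV0.4"))
```

So Ollama only serves the text/vision side; the image models never go through `ollama run` at all.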
Hello, I have no models showing in my node, despite having installed Ollama, the prompt generator, and the IF AI tools from the node manager. And I installed LLaVA and Nous Hermes. What's the problem? Why does my node "IF prompt to prompt" have no value in the parameter "base_ip"?