Closed lognat0704 closed 2 months ago
Hi, do you have the models on ollama? I am developing on Linux and was using it all day without any issues. I will try with the new ollama version; maybe something changed.
Besides running the model with the `ollama run` command, am I missing any steps? Maybe I should download the weights and manually copy them somewhere? Btw, my Linux machine is a Google VM.
Thanks in advance.
I am working on enabling setups where the server is not on the same machine. Is your ComfyUI in the same VM?
It's google VM instances. And I run the ollama server and the comfyui server in the same google VM machine.
I think it should work now. Could you please try again? You might need to enter the IP of the server in the base_ip field.
If not, you should set the Ollama host on the VM:
sudo systemctl edit ollama.service
Add this to the file and save it:
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama
Then try recreating the node.
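To confirm the steps above worked, you can hit Ollama's `/api/tags` endpoint from the machine running ComfyUI. This is a minimal sketch; the helper name `check_ollama` is mine, and `11434` is Ollama's default port (swap in your VM's IP for the host, the same value you'd put in base_ip):

```python
import json
import urllib.request


def check_ollama(host: str = "127.0.0.1", port: int = 11434,
                 timeout: float = 5.0):
    """Return the list of installed model names, or None if unreachable."""
    url = f"http://{host}:{port}/api/tags"  # Ollama's model-listing endpoint
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return [m["name"] for m in json.load(resp).get("models", [])]
    except OSError:
        # Connection refused / timed out: server down or OLLAMA_HOST not set
        return None


if __name__ == "__main__":
    models = check_ollama()
    if models is None:
        print("Ollama not reachable; check OLLAMA_HOST and the firewall")
    else:
        print("Ollama reachable; models:", models)
```

If this returns None from the ComfyUI machine but works with `host="127.0.0.1"` on the VM itself, the server is up but only bound to localhost, which is exactly what the `OLLAMA_HOST=0.0.0.0` override fixes.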
@rogeriodeoliveira Thank you. He posted this before the fix; I think it should work now. Does it work for you? I am trying to close the issue. I tested it on my WSL and it works, but I will try it later on a GCP instance. Thanks.
Hi, thanks for the amazing tool. I tried the example workflow on Linux, but it fails even though I started the ollama server and the IP matches the one in the "IF prompt to prompt" node in ComfyUI. My platform is Linux, not Windows. How do I fix this?