Closed HyperUpscale closed 1 month ago
Did you try with 127.0.0.1 as your base IP? Perhaps you are using a VPN that is blocking localhost; otherwise it should work as long as you have Ollama running and no other app is using the same port 11434.
Ok... that's a great suggestion.
But let me try to explain again - other Ollama tools are working... ONLY the IF tool is not. That means there is no problem with IP, firewall, or base URL...
Could you advise me how to check a log file or something to find the problem? I want to skip the back-and-forth of plain suggestions - I don't want to waste your time. I want to find it by myself.
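Since the thread is about ruling out connectivity before blaming the node, here is a minimal self-check sketch, assuming the default host/port from the node settings (127.0.0.1:11434). It only tells you whether *anything* is listening on that port, independently of ComfyUI or the IF node:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # If this prints False, the problem is Ollama/networking;
    # if True, the problem is likely inside the node itself.
    print("Ollama port reachable:", port_open("127.0.0.1", 11434))
```

If this reports the port as reachable while the node still shows no models, that supports the conclusion later in the thread that the node, not the connection, is at fault.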
Alright... I kinda fixed it manually. Which proves there is something wrong with the node... at least for me.
So manually setting the widget value fixes it :)
"widgets_values": [
"a flower",
"127.0.0.1",
"11434",
"ollama",
**"impactframes/llama3_ifai_sd_prompt_mkr_q4km:latest",**
"IF_PromptMKR",
"",
And when I click the arrow on the node to swap the model, it goes to undefined again... which causes the JSON value for the model to become null...
Just for your information. I will adjust the JSON manually, but something is not OK :)
Greetings.
Just a note: the other node that I tried (not IF) also doesn't query the models... But it is strange that when I queue the prompt, then they get queried. Maybe it is normal... I don't know.
When I manually set a wrong value (one that is not null), I see this error:
This is the correct reply from the "/api/tags" query (the list of the available models):
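For reference, this is roughly what a node has to do with that reply to populate the model dropdown. The response shape follows the documented Ollama `/api/tags` format (`{"models": [{"name": ...}, ...]}`); the sample model name is taken from the workflow JSON above, and the helper name is just illustrative:

```python
import json

# Illustrative stand-in for an actual GET http://127.0.0.1:11434/api/tags reply.
sample_reply = json.dumps({
    "models": [
        {"name": "impactframes/llama3_ifai_sd_prompt_mkr_q4km:latest"},
        {"name": "llama3:latest"},
    ]
})

def model_names(api_tags_json: str) -> list[str]:
    """Extract the model names the dropdown should be populated with."""
    data = json.loads(api_tags_json)
    return [m["name"] for m in data.get("models", [])]

print(model_names(sample_reply))
```

A query that never runs (or fails silently) would leave this list empty, which would match the inactive field and the undefined/null widget value described above.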
BTW... maybe I am doing something wrong... I just found out that I have the same issue with my Stable Diffusion ckpt images, which have nothing to do with IF, nor Ollama, nor the IP address... they are local files.
It is possible the problem comes from StableSwarmUI, which I am using...
I also see the same error in my new workflow when trying to use the Checkpoint loader...
So I think we can close this one.
Sorry for the issue.
I am looking for information, but there is none about this...
Even if I take a ready, working workflow with the exact same setup - Windows Ollama (WSL or native), even the same local addresses, ports, and models...
I tried different nodes that work with my Ollama, and they all work without a problem.
I tried 3 of the ImpactFrames LLM-related nodes, but none of them seem to get the Ollama models; the field is not active:
Any ideas how to troubleshoot this?