if-ai / ComfyUI-IF_AI_tools

ComfyUI-IF_AI_tools is a set of custom nodes for ComfyUI that allows you to generate prompts using a local Large Language Model (LLM) via Ollama. This tool enables you to enhance your image generation workflow by leveraging the power of language models.

Can't find ollama models at "selected_model" #38

Closed HyperUpscale closed 1 month ago

HyperUpscale commented 1 month ago

I am looking for information, but there is none about this...

Even if I take a ready, working workflow with the exact same setup: Windows Ollama (WSL or native), even the same local addresses, ports, and models...

I tried different nodes that work with my Ollama, and they all work without a problem.

I tried the three ImpactFrames LLM-related nodes, but none of them seem to get the Ollama models; the field is not active:

(screenshots of the three nodes)

Any ideas on how to troubleshoot this?

if-ai commented 1 month ago

Did you try 127.0.0.1 as your base IP? Perhaps you are using a VPN that is blocking localhost. Otherwise it should work, as long as you have Ollama running and no other app is using port 11434.
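
As an independent sanity check (outside any node), here is a minimal Python sketch that queries Ollama's `/api/tags` endpoint with the default host and port mentioned above; if this returns model names, the server and port are fine and the problem lies in the node:

```python
import json
import urllib.request

def tags_url(host: str, port: str) -> str:
    """Build the Ollama model-list endpoint URL."""
    return f"http://{host}:{port}/api/tags"

def list_models(host: str = "127.0.0.1", port: str = "11434",
                timeout: float = 3.0) -> list:
    """Return the model names Ollama reports, or [] if unreachable."""
    try:
        with urllib.request.urlopen(tags_url(host, port), timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except OSError:
        return []  # connection refused, timeout, DNS failure, etc.

if __name__ == "__main__":
    print(list_models() or "Ollama not reachable on 127.0.0.1:11434")
```

An empty result here points at connectivity (VPN, firewall, wrong port); a populated list means the node itself is failing to consume the reply.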

HyperUpscale commented 1 month ago

Ok... that's a great suggestion.

But let me try to explain again: other Ollama tools are working; ONLY the IF tool is not. That means there is no problem with the IP, firewall, or base URL...

Could you advise me on how to check a log file or something to find the problem? I want to skip the back-and-forth suggestions so as not to waste your time; I'd like to find it myself.

HyperUpscale commented 1 month ago

Alright... I kinda fixed it manually, which proves there is something wrong with the node... at least for me.

So manually setting the widget value (the model string is the fifth entry below) fixes it :)

```json
  "widgets_values": [
    "a flower",
    "127.0.0.1",
    "11434",
    "ollama",
    "impactframes/llama3_ifai_sd_prompt_mkr_q4km:latest",
    "IF_PromptMKR",
    "",
```
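
The manual fix above can be scripted. A hedged sketch, assuming the exported workflow JSON stores nodes under a `nodes` list, each with a flat `widgets_values` array where the model string sits at index 4 as in the fragment above (the node type name here is a guess for illustration):

```python
def patch_model(workflow: dict, node_type: str, model: str, slot: int = 4) -> dict:
    """Replace a null/undefined model entry in matching nodes' widgets_values."""
    for node in workflow.get("nodes", []):
        if node.get("type") == node_type:
            values = node.get("widgets_values", [])
            if len(values) > slot and values[slot] in (None, "undefined"):
                values[slot] = model
    return workflow

# Example with a stripped-down workflow dict (structure assumed, not exact):
wf = {"nodes": [{"type": "IF_PromptMKR",
                 "widgets_values": ["a flower", "127.0.0.1", "11434",
                                    "ollama", None, "IF_PromptMKR", ""]}]}
patched = patch_model(wf, "IF_PromptMKR",
                      "impactframes/llama3_ifai_sd_prompt_mkr_q4km:latest")
print(patched["nodes"][0]["widgets_values"][4])
```

Guarding on `None`/`"undefined"` means the script only repairs broken entries and never overwrites a model the user already picked.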


And when I click the arrow on the node to swap the model, it goes to undefined again... which causes the model entry in the JSON to become null...

Just for your information: I will adjust the JSON manually, but something is not OK :)

Greetings.

HyperUpscale commented 1 month ago

Just a note: the other node that I tried (not IF) also doesn't query the models... But strangely, when I queue the prompt, they do get queried. Maybe that's normal; I don't know.
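
That behaviour is consistent with how ComfyUI combo widgets are usually populated: the node's `INPUT_TYPES` classmethod builds the choice list, and if it happens to run while Ollama is unreachable, the widget is left empty or undefined until something re-evaluates it. A minimal sketch with a hypothetical node class and a fetch helper that falls back to a placeholder instead of leaving the widget undefined (names and structure are illustrative, not the actual IF_AI_tools code):

```python
import json
import urllib.request

def fetch_model_names(url: str = "http://127.0.0.1:11434/api/tags",
                      timeout: float = 2.0) -> list:
    """Ask Ollama for its models; fall back to a placeholder if unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = json.load(resp)
        names = [m["name"] for m in data.get("models", [])]
        return names or ["none"]
    except OSError:
        return ["none"]  # keeps the combo widget valid instead of undefined

class OllamaPromptNode:
    """Hypothetical node: the combo list is built when INPUT_TYPES is called."""
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"selected_model": (fetch_model_names(),)}}
```

With a fallback like `["none"]`, a failed fetch degrades to a visible placeholder rather than a null widget value in the saved workflow JSON.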

When I manually set a wrong value (one that is not null), I see this error:

(error screenshot)

This is the correct reply from the "/api/tags" query (the list of the available models):
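
Since the screenshot of that reply is not reproduced here: `/api/tags` returns a JSON object with a `models` array, and extracting the names is a one-liner. The payload below is illustrative sample data, not the actual reply from the screenshot:

```python
import json

# Illustrative /api/tags payload (same shape as Ollama's model-list response).
reply = json.loads("""
{"models": [
  {"name": "impactframes/llama3_ifai_sd_prompt_mkr_q4km:latest"},
  {"name": "llama3:latest"}
]}
""")

names = [m["name"] for m in reply["models"]]
print(names)
```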

HyperUpscale commented 1 month ago

BTW... maybe I am doing something wrong... I just found out that I have the same issue with my Stable Diffusion ckpt images, which have nothing to do with IF, Ollama, or the IP address... they are local files.

It is possible the problem comes from StableSwarmUI, which I am using...

I also see the same error for my new workflow when trying to use the Checkpoint loader...


So I think we can close this one.

Sorry for the issue.