if-ai / ComfyUI-IF_AI_tools

ComfyUI-IF_AI_tools is a set of custom nodes for ComfyUI that allows you to generate prompts using a local Large Language Model (LLM) via Ollama. This tool enables you to enhance your image generation workflow by leveraging the power of language models.
https://ko-fi.com/impactframes

IF chat prompt doesn't get the image #22

Closed: blueraincoatli closed this issue 2 months ago

blueraincoatli commented 2 months ago

I loaded an image into the IF chat prompt image input and asked it to describe the picture, but the response is not correct; it seems that the node doesn't get the image.

if-ai commented 2 months ago

Yes, for images you need a multi-modal model; LLaMA3 is text-only. Here is a LLaVA 1.6 model you can pull: ollama run llava:7b-v1.6-mistral-q5_0. You can also see all the available models on this list: https://ollama.com/library/llava. The bigger they are, the better quality you get, but they also take more resources (VRAM, etc.). If you click on the "latest" dropdown you can check them out.
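
For reference, a minimal sketch of what a multi-modal request to Ollama looks like once the model is pulled. The node handles this for you; the endpoint and payload shape below are standard Ollama API, and test.png is just a hypothetical local image:

```python
# Minimal sketch: describe an image with a multi-modal model via Ollama's HTTP API.
# Assumes Ollama is running on its default port (11434) and the model has already
# been pulled with `ollama run llava:7b-v1.6-mistral-q5_0`.
import base64
import json
import urllib.request

with open("test.png", "rb") as f:                      # any local image to describe
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {
    "model": "llava:7b-v1.6-mistral-q5_0",
    "prompt": "Describe this picture.",
    "images": [image_b64],                             # text-only models can't use this
    "stream": False,
}
req = urllib.request.Request(
    "http://127.0.0.1:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```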

blueraincoatli commented 2 months ago

It works after I got llava and switched the model in the IF-AI node, thank you very much!

if-ai commented 2 months ago

Thank you, I will close the issue then

Linaghan34 commented 1 month ago

After running the command "ollama run llava" the model is automatically downloaded to the C drive. Do I need to move the model to a certain folder? My node does not recognize the model! Thanks.

if-ai commented 1 month ago

No, if Ollama is running it should see the model. If you open your terminal, type ollama list, and llava is listed, it should also show up in Comfy; it uses the same command. So if the Ollama service is running, Comfy should see it.
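
If you want to check from Python what Comfy should be seeing, a quick sketch against Ollama's /api/tags endpoint (the same list that ollama list prints; the default port 11434 is standard Ollama, not specific to this node):

```python
# Quick check: ask the Ollama service which models it has available, roughly what
# the node needs when it populates its model list. Assumes the default port 11434.
import json
import urllib.request

with urllib.request.urlopen("http://127.0.0.1:11434/api/tags") as resp:
    models = json.loads(resp.read())["models"]

for m in models:
    print(m["name"])        # e.g. "llava:7b-v1.6-mistral-q5_0"

# If llava shows up here but not in the node, the node is likely pointed at the
# wrong base_ip/port rather than missing the model files.
```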

Linaghan34 commented 1 month ago

> No, if Ollama is running it should see the model. If you open your terminal, type ollama list, and llava is listed, it should also show up in Comfy; it uses the same command. So if the Ollama service is running, Comfy should see it.

Thanks, I'm going to reinstall 0.1.25; I had the latest version installed previously.

if-ai commented 1 month ago

No, the very latest version works fine; I have been using it. They fixed that issue.

Linaghan34 commented 1 month ago

> No, the very latest version works fine; I have been using it. They fixed that issue.

Sorry to interrupt again. I reinstalled 0.1.25 and still can't load Ollama's model, and the reply given by Claude 3 doesn't seem to help!

if-ai commented 1 month ago

If you are on the same computer, try writing 127.0.0.1 as the base IP instead of localhost. Could you give it a try please?
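
A small troubleshooting sketch for checking which address the Ollama service actually answers on (11434 is Ollama's default port); whichever host responds is the value to put in the node's base_ip field:

```python
# Troubleshooting sketch: see whether Ollama answers on localhost, 127.0.0.1, or neither.
import urllib.error
import urllib.request

for host in ("localhost", "127.0.0.1"):
    url = f"http://{host}:11434/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=3) as resp:
            print(f"{host}: OK (HTTP {resp.status})")
    except (urllib.error.URLError, OSError) as exc:
        print(f"{host}: no response ({exc})")
```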

if-ai commented 1 month ago

In your user folder on your computer, locate that file (see the explorer screenshot) and open it (see the notepad screenshot). Look for any message with source=routes to check what your base_ip is.
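
If you prefer to search that log from Python instead of opening it by hand, a small sketch; the %LOCALAPPDATA%\Ollama\server.log path is an assumption for a default Windows install, so adjust it if your log lives elsewhere:

```python
# Sketch: scan the Ollama server log for "source=routes" lines, which include the
# address the server reports listening on. The log path below is assumed to be a
# default Windows install (%LOCALAPPDATA%\Ollama\server.log); adjust as needed.
import os

log_path = os.path.join(os.environ.get("LOCALAPPDATA", ""), "Ollama", "server.log")

with open(log_path, encoding="utf-8", errors="replace") as f:
    for line in f:
        if "source=routes" in line:
            print(line.rstrip())   # look for the host:port Ollama reports here
```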

Linaghan34 commented 1 month ago

> If you are on the same computer, try writing 127.0.0.1 as the base IP instead of localhost. Could you give it a try please?

It worked for me, thanks so much. I found the file and it's exactly what you said.

if-ai commented 1 month ago

Nice, thanks. Glad it works now.