alisson-anjos / ComfyUI-Ollama-Describer

A ComfyUI extension that allows you to use some LLM models provided by Ollama, such as Gemma, Llava (multimodal), Llama2, Llama3, or Mistral
MIT License

Model selection. #5

Closed jdamboeck closed 5 months ago

jdamboeck commented 5 months ago

Is there a reason for pre-selecting possible models instead of just giving a selection of the installed models?

alisson-anjos commented 5 months ago

Is there a reason for pre-selecting possible models instead of just giving a selection of the installed models?

Hello. The idea was to make some models available as soon as someone installs the custom node, because otherwise the list would appear blank. Now, with custom_model, you can install a model by entering its name in that field and then pick from the already-installed models in the model field. Beyond that, there is the issue of classifying which models are vision models and which are not; I have no way of knowing that other than keeping this information predefined in the code.
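For reference, Ollama exposes a local HTTP endpoint that lists the models installed on the machine, which is what a dynamic dropdown would need to query. Below is a minimal sketch, assuming the default Ollama address `http://localhost:11434`; the function name is illustrative and not part of the extension:

```python
import requests

def list_installed_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return the names of locally installed models via Ollama's /api/tags endpoint."""
    response = requests.get(f"{base_url}/api/tags", timeout=10)
    response.raise_for_status()
    data = response.json()
    # The response contains a "models" list; each entry has a "name" such as "llama3:latest".
    return [model["name"] for model in data.get("models", [])]

if __name__ == "__main__":
    print(list_installed_models())
```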

jdamboeck commented 5 months ago

Maybe some kind of override would be good? Or a second node? I would really like to use your node, but with the models I have installed.

jdamboeck commented 5 months ago

I see you already did it :) Awesome, thank you!

alisson-anjos commented 5 months ago

I see you already did it :) Awesome, thank you!

So, I added the option to select a custom_model. You still need to type the model name, but with that you can already use your own models. Just keep in mind that for Ollama Image Describer you need a model that is multimodal (with vision). I will look for a way to extract information directly from Ollama to find out whether a model supports vision and to list the ones installed on the machine.
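One possible heuristic for the vision check mentioned above is Ollama's `/api/show` endpoint, whose `details.families` field typically includes `"clip"` for multimodal models such as Llava. This is only a sketch under that assumption, not the extension's actual implementation, and the exact response layout may vary between Ollama versions:

```python
import requests

def supports_vision(model_name: str, base_url: str = "http://localhost:11434") -> bool:
    """Heuristic: treat a model as multimodal if its family list includes a vision tower ("clip")."""
    response = requests.post(f"{base_url}/api/show", json={"name": model_name}, timeout=10)
    response.raise_for_status()
    families = response.json().get("details", {}).get("families") or []
    return "clip" in families

if __name__ == "__main__":
    for name in ("llava:latest", "llama3:latest"):
        print(name, "vision:", supports_vision(name))
```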

jdamboeck commented 5 months ago

Yeah, I was just about to write when I saw the update. Nice, everything I need is there. But sure, for some people it may be better to know which models they can use.