xlinx / ComfyUI-decadetw-auto-prompt-llm

ComfyUI extension. Auto prompt using LLM and LLM-Vision
MIT License

How do I use ollama instead of lmstudio? #5

Open · scaruslooner opened this issue 2 months ago

scaruslooner commented 2 months ago

[screenshot attached to original comment]

xlinx commented 2 months ago

You need to install LM Studio or ollama first. LM Studio: start the LLM service on port 1234 (I suggest using this one). ollama: start the service on port 11434.
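For anyone unsure what changes between the two: both backends expose an OpenAI-compatible chat endpoint, so switching to ollama mostly means pointing at a different base URL and naming a model ollama has pulled. Below is a minimal sketch outside the extension that exercises that endpoint directly; the model name `llama3` is only an example, and the ports are the defaults mentioned above.

```python
import requests

# Default OpenAI-compatible base URLs:
#   LM Studio: http://localhost:1234/v1
#   ollama:    http://localhost:11434/v1
BASE_URL = "http://localhost:11434/v1"  # point at ollama instead of LM Studio
MODEL = "llama3"  # example only; must be a model you pulled, e.g. `ollama pull llama3`

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Write a short image prompt."}],
    },
    timeout=120,
)
resp.raise_for_status()
# Print the assistant's reply from the OpenAI-style response body
print(resp.json()["choices"][0]["message"]["content"])
```

If this script gets a reply but the node does not, the URL or model name entered in the node is the likely mismatch.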

Torcelllo commented 1 month ago

I have ollama installed, yet I cannot get it to work in the node
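One thing worth ruling out first is whether the ollama service itself is reachable. A quick check (the endpoint and port are ollama defaults, not part of this extension): `GET /api/tags` lists the models ollama has pulled, and an empty list means there is no model for the node to use.

```python
import requests

try:
    r = requests.get("http://localhost:11434/api/tags", timeout=5)
    r.raise_for_status()
    # ollama returns {"models": [{"name": ...}, ...]}
    models = [m["name"] for m in r.json().get("models", [])]
    print("ollama is up; pulled models:", models)
except requests.RequestException as err:
    print("ollama is not reachable on port 11434:", err)
```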

dkqjhoifh commented 3 weeks ago

me too