xlinx / ComfyUI-decadetw-auto-prompt-llm

ComfyUI extension. Auto prompt using LLM and LLM-Vision
MIT License

How do I use ollama instead of LM Studio? #5

Open scaruslooner opened 2 weeks ago

scaruslooner commented 2 weeks ago


xlinx commented 1 week ago

You need to install LM Studio or ollama first, then start its local service:

- LM Studio: start the LLM service on port 1234 (suggested).
- ollama: start the service on port 11434.
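To check which backend is actually listening before pointing the extension at it, you can probe the two default ports mentioned above. A minimal sketch (the `service_up` helper is hypothetical, not part of the extension; ports are LM Studio's and ollama's defaults):

```python
import socket

def service_up(port: int, host: str = "127.0.0.1", timeout: float = 1.0) -> bool:
    """Return True if something is listening on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# LM Studio serves on port 1234 by default; ollama on 11434.
for name, port in [("LM Studio", 1234), ("ollama", 11434)]:
    status = "running" if service_up(port) else "not running"
    print(f"{name} (port {port}): {status}")
```

If ollama is installed but nothing is listening on 11434, running `ollama serve` in a terminal should bring the service up.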