
Problem in LLM-text with Ollama #31

Open marc2608 opened 1 month ago

marc2608 commented 1 month ago

Hello, I can't get LLM-text to work with Ollama. Could I get an explanation of how to configure the setup exactly, for example regarding the API key, etc.? I have Ollama running on my PC while A1111 is running, with llama3.1 loaded. The Civitai meta grabber works fine, and LLM-text also works when configured for OpenAI with an API key, but I can't get it to work with Ollama. In the LLM answer window I keep getting this message: [Auto-LLM][Result][Missing LLM-Text]'choices' Thank you very much in advance.

LadyFlames commented 4 weeks ago

That's due to the localhost URL you set in the webui settings. It has to be just the base URL, e.g. http://localhost:1234/v1 (or whatever port you personally have it set to). Don't add anything after the /v1, such as /chat/completions, otherwise you will get errors and warnings. Also make sure the LLM port is not the same as the webui port. Keep it as http://localhost:&lt;port&gt;/v1 and everything should work fine.
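For example, a quick sanity check that the endpoint is responding (a sketch assuming Ollama's OpenAI-compatible endpoint on its default port 11434, with llama3.1 already pulled; substitute your own port and model):

```bash
# The extension appends /chat/completions itself, which is why the base URL
# you configure must stop at /v1. Hitting the full path by hand should
# return a JSON object containing a "choices" array:
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.1", "messages": [{"role": "user", "content": "hello"}]}'
```

If that curl call returns JSON without a "choices" key, the [Missing LLM-Text]'choices' error comes from the server side, not the extension settings.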

marc2608 commented 4 weeks ago

OK, thank you. Could you tell me what I should change in this config, and where? (Screenshot: 2024-10-25 170558)

LadyFlames commented 4 weeks ago

You have it mostly correct, but the port has to be 11434 so it matches Ollama; that should make it work as intended. Just make sure the webui port itself is not 11434. Also, if you haven't already, I'd suggest running A1111 through Stability Matrix instead; it makes changing settings much easier.
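A quick way to confirm which port Ollama is actually listening on (a sketch assuming the default install; 11434 is Ollama's default port, and the alternate port below is just an illustration):

```bash
# Should return a JSON list of installed models if Ollama is up on 11434:
curl http://localhost:11434/api/tags

# If 11434 clashes with something else, Ollama can be started on another
# port via the OLLAMA_HOST environment variable:
OLLAMA_HOST=127.0.0.1:11500 ollama serve
```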

xlinx commented 3 weeks ago

```
ollama run llama3.2
```

P.S. The sheep (llama) icon in the system tray only means the server is running; it does not mean a model is loaded. You need to load a model with the command above. Also change the default llama3.1 to llama3.2 (depending on which model you load), same as @LadyFlames says.

(Screenshot: Ollama icon in the system tray)
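For reference, the usual sequence (assuming the llama3.2 model; substitute whichever model you want to use):

```bash
ollama pull llama3.2   # download the model if you haven't already
ollama list            # show models available locally
ollama ps              # show models currently loaded in memory
ollama run llama3.2    # load the model and open an interactive prompt
```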

marc2608 commented 3 weeks ago

Thanks to your advice, my problem is solved; it works perfectly. Many thanks to @LadyFlames and @xlinx!

LadyFlames commented 3 weeks ago

Anytime.