xlinx / ComfyUI-decadetw-auto-prompt-llm

ComfyUI extension. Auto prompt using LLM and LLM-Vision
MIT License

issues with koboldCpp #7

Open DCinAC opened 1 month ago

DCinAC commented 1 month ago

I am using Kcpp because LM Studio does not work with my GPU configuration, so I have no other choice. I successfully connected it to the OpenAI-compatible completions API, but when I run the ✨ Auto-LLM-Text-Vision node I get the following error on Kcpp's console (meanwhile ComfyUI just waits forever):

(Translated from Chinese: LM Studio is not very friendly to multi-GPU machines, and for now there is no way to pick which GPU it uses, so I was forced to use Kcpp.)

llm_apiurl = http://localhost:5001/v1/completions


```
Input: {"model": "", "messages": [{"role": "system", "content": "You are an AI prompt word engineer. Use the provided keywords to create a beautiful composition. Only the prompt words are needed, not your feelings. Customize the style, scene, decoration, etc., and be as detailed as possible without endings."}, {"role": "user", "content": "(prompt censored for privacy)"}], "max_tokens": "80", "temperature": "0.3", "stream": "False"}
'>=' not supported between instances of 'str' and 'int'
```
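The error is likely caused by the numeric fields in the request being sent as JSON strings (`"max_tokens": "80"`, `"temperature": "0.3"`, `"stream": "False"`); KoboldCpp then compares the string `"80"` against an integer and raises the `'>='` TypeError. A minimal sketch of a workaround, coercing the payload to proper types before sending (the helper name and the exact set of fields are assumptions, not part of the extension's API):

```python
import json

def sanitize_payload(payload: dict) -> dict:
    """Hypothetical helper: convert string-typed numeric/boolean fields
    of an OpenAI-style completions payload to their proper JSON types
    before POSTing to KoboldCpp's /v1/completions endpoint."""
    fixed = dict(payload)
    if isinstance(fixed.get("max_tokens"), str):
        fixed["max_tokens"] = int(fixed["max_tokens"])        # "80" -> 80
    if isinstance(fixed.get("temperature"), str):
        fixed["temperature"] = float(fixed["temperature"])    # "0.3" -> 0.3
    if isinstance(fixed.get("stream"), str):
        fixed["stream"] = fixed["stream"].lower() == "true"   # "False" -> False
    return fixed

# Reproduces the request shown in the log above (messages omitted)
raw = {"model": "", "max_tokens": "80", "temperature": "0.3", "stream": "False"}
print(json.dumps(sanitize_payload(raw)))
```

With this applied, the server receives `"max_tokens": 80` as an integer and the str/int comparison can no longer occur; the real fix would be the extension building the payload with native types in the first place.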

Model (if that helps): Eris_PrimeV4-Vision-32k-7B-GGUF-IQ-Imatrix