Open LankyPoet opened 1 week ago
Thanks for the suggestion, I thought about this. Before llama-cpp-python, I was actually planning to do this with LM Studio, but you need to install another program, etc., so I chose llama-cpp-python instead. But yes, I can add it to my nodes. I already have OpenAI nodes, and they accept a custom URL (I use them with the DeepSeek API the same way). All I need to do is expose the custom URL in the UI. I might add vision capability to that as well.
Sounds amazing, thank you!
Hi, I would love it if, instead of relying on llama-cpp-python, we could use any backend of our choosing. For instance, LM Studio exposes an OpenAI-compatible API at http://localhost:1234/v1/chat/completions. Can that be added as an option, instead of having to load/unload models within ComfyUI itself? See the Plush nodes for one implementation of this.
Thank you!
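For anyone following along, a minimal sketch of what the node would do: POST an OpenAI-style chat completion request to a user-configurable URL. This is a hypothetical illustration, not the extension's actual code; the model name `local-model` is an assumption (LM Studio typically ignores it for the currently loaded model), and the endpoint is the one mentioned above.

```python
import json
import urllib.request

# Assumed default: LM Studio's local OpenAI-compatible endpoint.
# Any backend exposing /v1/chat/completions should work the same way.
API_URL = "http://localhost:1234/v1/chat/completions"


def build_payload(prompt: str, model: str = "local-model",
                  temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,  # placeholder; many local backends ignore this
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }


def chat(prompt: str, url: str = API_URL) -> str:
    """POST the prompt to the backend and return the assistant's reply."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI-compatible response shape
    return body["choices"][0]["message"]["content"]
```

Because the request shape is the standard OpenAI one, pointing the same code at the DeepSeek API (or any other compatible server) only requires changing the URL and adding an `Authorization` header where needed.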