gokayfem / ComfyUI_VLM_nodes

Custom ComfyUI nodes for Vision Language Models, Large Language Models, Image to Music, Text to Music, Consistent and Random Creative Prompt Generation
Apache License 2.0

Add ability to perform OpenAI API compatible calls? #96

Open LankyPoet opened 1 week ago

LankyPoet commented 1 week ago

Hi, I would love it if, instead of relying on llama-cpp-python, we could use any backend of our choosing, for instance LM Studio serving its API at http://localhost:1234/v1/chat/completions. Could that be added as an option, instead of having to load/unload models within ComfyUI itself? See the Plush nodes for one implementation of this.

Thank you!

gokayfem commented 1 week ago

Thanks for the suggestion; I had thought about this. Before settling on llama-cpp-python, I was actually planning to do this with LM Studio, but that requires installing another program, so I chose llama-cpp-python instead. That said, yes, I can add it to my nodes. I already have OpenAI nodes, and they already accept a custom URL (I use them with the DeepSeek API the same way). All I need to do is expose the custom URL in the UI. I might add vision capability to that as well.
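For reference, a minimal sketch of what such a call looks like against an OpenAI-compatible endpoint (e.g. LM Studio's local server at the URL from the original request). This is not the repo's actual node code; the model name `local-model` and the dummy API key are assumptions, since local servers typically ignore both:

```python
import json
import urllib.request

# Assumed LM Studio default endpoint; any OpenAI-compatible URL works here.
BASE_URL = "http://localhost:1234/v1/chat/completions"


def build_request(prompt, model="local-model", max_tokens=256):
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def chat(prompt, url=BASE_URL, api_key="not-needed"):
    """POST the payload and return the assistant's reply text."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        url,
        data=payload,
        headers={
            "Content-Type": "application/json",
            # Local servers usually ignore the key, but OpenAI-compatible
            # clients are expected to send the Authorization header anyway.
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

Because the request shape is the standard Chat Completions format, the same function would work against LM Studio, a DeepSeek endpoint, or the official OpenAI API just by changing `url` and `api_key`.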

LankyPoet commented 1 week ago

Sounds amazing, thank you!