Closed Andoramb closed 12 months ago
Hey there! Thanks for the suggestion.
I'll look into this. My initial thought is that this would take the project in a new direction, which I'm very much open to as it evolves. As things currently stand, the node is quite specific to OpenAI's REST API endpoints, request paths, and payloads, so we'd be looking at more than simply adding an option for a custom API URL in the node settings.
I just opened Discussions for this project. Feel free to start a conversation about your idea over there! Oh, and be sure to update to the latest version of the node if you haven't done so in the past day. I fixed several initial bugs with the OpenAI Assistants requests.
Cheers!
Hi!
Would it be possible to provide a custom API URL when setting up the node? This could be useful when running local LLM models with tools such as text-generation-webui-docker or LM Studio.
That way, API tokens wouldn't really matter.
Thanks :)
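To illustrate the idea: many local servers (LM Studio, for instance, which by default listens on http://localhost:1234/v1) expose OpenAI-compatible endpoints, so the only change needed is where the request is sent. Here's a minimal sketch of building an OpenAI-style chat completion request against a configurable base URL. The function name, the base URL, and the model name are all assumptions for illustration, not part of the project's actual code:

```python
import json
import urllib.request


def build_chat_request(base_url: str, payload: dict, api_key: str = "not-needed"):
    """Build an OpenAI-compatible chat completion request against any base URL.

    Local servers typically ignore the Authorization header, so a dummy
    api_key is fine when talking to them.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# Hypothetical usage against LM Studio's default local endpoint:
req = build_chat_request(
    "http://localhost:1234/v1",
    {"model": "local-model", "messages": [{"role": "user", "content": "Hi"}]},
)
```

The point being that if the node's request path and payload already match the OpenAI shape, swapping in a different base URL is the only moving part.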