Closed sammcj closed 12 months ago
We need a way to set the OpenAI API endpoint URL, both in the examples and in the helper libraries.
This is usually referred to as the "BASE_URL" in other languages and defaults to
https://api.openai.com/v1
. It needs to be configurable so the client can run against OpenAI-compatible APIs at different endpoints; for example, I might want to point it at
https://openai-proxy.local/v1
.
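To make the request concrete: all that should vary between providers is the base, with the API paths appended unchanged. A minimal C sketch of that joining step follows; the macro `OPENAI_BASE_URL` and the helper `build_endpoint` are illustrative names for this issue, not the component's actual API.

```c
#include <stdio.h>
#include <string.h>

/* Illustrative default; a real fix would let Kconfig or a setter
 * override this instead of hard-coding the host. */
#ifndef OPENAI_BASE_URL
#define OPENAI_BASE_URL "https://api.openai.com/v1"
#endif

/* Join a configurable base URL with an API path such as
 * "/chat/completions". Returns the length written, or -1 if the
 * buffer is too small. */
static int build_endpoint(char *buf, size_t buf_len,
                          const char *base_url, const char *path)
{
    int n = snprintf(buf, buf_len, "%s%s", base_url, path);
    return (n < 0 || (size_t)n >= buf_len) ? -1 : n;
}
```

With this, swapping `OPENAI_BASE_URL` for `https://openai-proxy.local/v1` leaves every request path untouched.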
Hello, if you change the URL as you desire, will the format of the POST request change with the URL?
No, other than the base URL, all other paths on the API are the same. There are several projects that provide 100% OpenAI-compatible APIs; all you need to do is change the base URL to point at your server's address.
This is a great idea, can you provide links to the projects you mentioned?
Yep sure thing, most of the popular LLM servers offer it now.
- https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai
- https://localai.io/features/openai-functions/ and https://localai.io/howtos/easy-request-openai/
- https://lmstudio.ai/
There are also nice integration libraries like LiteLLM, which basically means any project that uses it automatically gets an OpenAI-compatible API: https://github.com/BerriAI/litellm
The irony that all of this has to happen because openAI isn’t in the slightest bit open is not lost on me.
Okay, thank you very much for your ideas. We will reference the projects you provided to evaluate how to make the endpoint URL configurable to meet your needs.
Fantastic! Thank you so much.
I had a hack on the upstream espressif package that provides OpenAI; I suspect the change might be something like this - https://github.com/espressif/esp-iot-solution/compare/master...sammcj:esp-iot-solution:master
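The shape of such a change might be a runtime setter on the client, so the examples can override the default after construction. This is only a sketch under assumed names (`openai_client_t`, `openai_set_base_url`); the actual struct and function names in esp-iot-solution differ.

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical client struct; field and function names here are
 * illustrative, not the esp-iot-solution component's real API. */
typedef struct {
    char *base_url;
    char *api_key;
} openai_client_t;

/* Replace the default base URL at runtime. Copies the string so the
 * caller's buffer need not outlive the client. Returns 0 on success,
 * -1 on allocation failure (leaving the old URL in place). */
static int openai_set_base_url(openai_client_t *c, const char *url)
{
    size_t len = strlen(url) + 1;
    char *copy = malloc(len);
    if (copy == NULL) return -1;
    memcpy(copy, url, len);
    free(c->base_url);
    c->base_url = copy;
    return 0;
}
```

Request builders would then read `c->base_url` instead of a hard-coded host, which is the essence of the linked diff.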
Certainly, we have seen the modifications you made, and they look appropriate. We will further evaluate, and if there are no issues, we will implement your suggestions. Thank you very much for your advice.