We could leverage https://github.com/xtekky/gpt4free for that.
I think you can use https://github.com/BerriAI/litellm/tree/main. It shouldn't be too hard... (You can call the repo PoLite 🤣🤣🤣🤣)
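For reference, litellm's unified interface looks roughly like this (a minimal sketch; the model name is a generic placeholder, and a Poe backend would still need its own adapter, which this sketch does not provide):

```python
# Minimal sketch of litellm's unified completion call.
# The model name is a placeholder; a Poe integration would still need
# its own provider/adapter on top of this interface.
from litellm import completion

response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello, world"}],
)
print(response.choices[0].message.content)
```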
I made a bare-bones OpenAI-compatible API using your library and have been using it locally for a few hours. It currently only mimics the `/v1/chat/completions` (no text streaming yet) and `/v1/models` endpoints, but I plan on implementing Messages, Threads, and Assistants as Poe messages, chats, and bots respectively.

Would you accept a pull request adding this functionality to the library? (Or a variant, à la `poe-api-wrapper[openai-proxy]`, with an additional Flask dependency.) It could add something like `poe-openai-server=poe_api_wrapper.openai-server:main` to `entry_points` in `setup.py`. Or, would you rather I publish this as a separate project (with your library acknowledged, of course)?
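To make the shape of that shim concrete, here's a minimal sketch of the two endpoints. It assumes poe_api_wrapper exposes a `PoeApi(tokens)` client whose `send_message(bot, prompt)` yields chunks with a `"response"` key (as in its README); the token values, the bot handles in `MODEL_MAP`, and the port are placeholders, not the final implementation:

```python
# Sketch of an OpenAI-compatible Flask shim over poe_api_wrapper.
# NOTE: the PoeApi usage below (constructor tokens, send_message(bot, prompt)
# yielding chunks with a "response" key) is assumed from the library's README
# and may need adjusting; the bot names in MODEL_MAP are placeholders.
import time
import uuid

from flask import Flask, jsonify, request
from poe_api_wrapper import PoeApi

app = Flask(__name__)
tokens = {"p-b": "...", "p-lat": "..."}  # fill in your Poe cookies
poe = PoeApi(tokens)                     # assumed constructor

# Map OpenAI-style model names to Poe bot handles (placeholder mapping).
MODEL_MAP = {"gpt-3.5-turbo": "chinchilla", "gpt-4": "beaver"}

@app.get("/v1/models")
def list_models():
    return jsonify({
        "object": "list",
        "data": [{"id": m, "object": "model", "owned_by": "poe"} for m in MODEL_MAP],
    })

@app.post("/v1/chat/completions")
def chat_completions():
    body = request.get_json(force=True)
    bot = MODEL_MAP.get(body.get("model", "gpt-3.5-turbo"), "chinchilla")
    # Flatten the chat history into a single prompt (no streaming yet).
    prompt = "\n".join(f"{m['role']}: {m['content']}" for m in body["messages"])

    text = ""
    for chunk in poe.send_message(bot, prompt):  # assumed generator API
        text += chunk.get("response", "")

    return jsonify({
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": body.get("model", "gpt-3.5-turbo"),
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": "stop",
        }],
        "usage": {"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0},
    })

if __name__ == "__main__":
    app.run(port=8000)
```

Streaming would map onto the `stream=True` SSE chunk format of the OpenAI API, which this sketch skips.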
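And the packaging side could look something like this; the module path `poe_api_wrapper.openai_server` is hypothetical (underscore rather than hyphen, since a hyphenated module name wouldn't be importable):

```python
# setup.py -- sketch of the proposed optional extra and console script.
# The module path poe_api_wrapper.openai_server is hypothetical.
from setuptools import setup, find_packages

setup(
    name="poe-api-wrapper",
    packages=find_packages(),
    extras_require={
        "openai-proxy": ["flask"],  # pip install poe-api-wrapper[openai-proxy]
    },
    entry_points={
        "console_scripts": [
            "poe-openai-server = poe_api_wrapper.openai_server:main",
        ],
    },
)
```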
Sounds cool! Integrating your OpenAI wrapper into the library is a good idea. I'd be happy to take a look at a pull request with that functionality added. Love to see what you come up with.
Hi @Aws-killer, do you use LiteLLM Proxy? Can we hop on a call to learn how we can make litellm better for you?
Link to my calendar for your convenience
Is there a release date?
I'll add this feature in the next release
This is added in v1.4.8. I'll add image, embedding, and files support later.
Is there a plan for an OpenAI API compatible reverse proxy?
Most chat UI apps and dataflow frameworks have a connector to the OpenAI API, so a reverse proxy mimicking that API would be a godsend.
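That's exactly why mimicking the API surface pays off: any OpenAI connector can be repointed with just a base URL override. A minimal client-side sketch with the official openai Python client, assuming a proxy listening locally on port 8000 (address, key, and model name are placeholders):

```python
# Point the standard OpenAI client at a local OpenAI-compatible proxy.
# The base_url, api_key, and model name are placeholders for illustration.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from a chat UI connector"}],
)
print(resp.choices[0].message.content)
```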