snowby666 / poe-api-wrapper

👾 A Python API wrapper for Poe.com. With this, you will have free access to GPT-4, Claude, Llama, Gemini, Mistral and more! 🚀
https://pypi.org/project/poe-api-wrapper/
GNU General Public License v3.0

[feature] Is there a plan for an OpenAI API compatible reverse proxy? #67

Closed · vibl closed this 5 months ago

vibl commented 1 year ago

Is there a plan for an OpenAI API compatible reverse proxy?

Most chat UI apps and dataflow frameworks have a connector to the OpenAI API, so a reverse proxy mimicking that API would be a godsend.
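To illustrate the point: any tool built on the official openai Python client could be repointed at such a proxy just by overriding the base URL. A minimal sketch, assuming a proxy listening on localhost:8000 (the address, API key, and model-to-bot mapping are all placeholders):

```python
# Point the official openai client at a hypothetical local proxy instead of
# api.openai.com; base_url and api_key here are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",  # the proxy would map this to a Poe bot
    messages=[{"role": "user", "content": "Hello via the proxy!"}],
)
print(resp.choices[0].message.content)
```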

vibl commented 1 year ago

We could leverage https://github.com/xtekky/gpt4free for that.

Aws-killer commented 1 year ago

I think you can use https://github.com/BerriAI/litellm/tree/main; it shouldn't be too hard... (You can call the repo PoLite 🤣🤣)
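For illustration, LiteLLM can already route OpenAI-style calls to any OpenAI-compatible base URL; a rough sketch, where api_base points at a hypothetical Poe-backed proxy (both the URL and the key are placeholders):

```python
# Sketch: litellm forwarding an OpenAI-style call to an OpenAI-compatible
# endpoint. The "openai/" prefix tells litellm to speak the OpenAI protocol;
# api_base and api_key are placeholders for a hypothetical Poe-backed proxy.
import litellm

response = litellm.completion(
    model="openai/gpt-3.5-turbo",
    api_base="http://localhost:8000/v1",
    api_key="not-needed",
    messages=[{"role": "user", "content": "Hi there"}],
)
print(response.choices[0].message.content)
```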

Mchar7 commented 11 months ago

I made a bare-bones OpenAI-compatible API using your library and have been using it locally for a few hours. It currently only mimics the /v1/chat/completions (no text streaming yet) and /v1/models endpoints, but I plan on implementing Messages, Threads, and Assistants as Poe messages, chats, and bots respectively.

Would you accept a pull request adding this functionality to the library? (Or a variant, à la poe-api-wrapper[openai-proxy] with an additional Flask dependency) It could add something like poe-openai-server=poe_api_wrapper.openai-server:main to entry_points in setup.py. Or, would you rather I publish this as a separate project (with your library acknowledged, of course)?
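For context, the core of such a shim is small. A minimal sketch of those two routes in Flask, assuming PoeApi(tokens).send_message(bot, prompt) yields chunks with a cumulative "text" field as in this repo's README; the cookie values and the model-to-bot mapping are placeholders:

```python
from flask import Flask, jsonify, request
from poe_api_wrapper import PoeApi

app = Flask(__name__)

# Placeholder cookie values; the tokens dict follows the form shown in the README.
poe = PoeApi(tokens={"p-b": "YOUR_P_B_COOKIE", "p-lat": "YOUR_P_LAT_COOKIE"})

# Illustrative mapping from OpenAI model names to Poe bot handles.
MODEL_TO_BOT = {"gpt-3.5-turbo": "chinchilla", "gpt-4": "beaver"}

@app.post("/v1/chat/completions")
def chat_completions():
    body = request.get_json()
    bot = MODEL_TO_BOT.get(body.get("model", "gpt-3.5-turbo"), "chinchilla")
    # Flatten the OpenAI message list into a single prompt for Poe.
    prompt = "\n".join(m["content"] for m in body["messages"])
    text = ""
    for chunk in poe.send_message(bot, prompt):
        text = chunk["text"]  # cumulative response text so far
    return jsonify({
        "object": "chat.completion",
        "model": body.get("model"),
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": "stop",
        }],
    })

@app.get("/v1/models")
def list_models():
    return jsonify({
        "object": "list",
        "data": [{"id": m, "object": "model"} for m in MODEL_TO_BOT],
    })

if __name__ == "__main__":
    app.run(port=8000)
```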

snowby666 commented 11 months ago

> I made a bare-bones OpenAI-compatible API using your library [...] Would you accept a pull request adding this functionality to the library? Or, would you rather I publish this as a separate project (with your library acknowledged, of course)?

Sounds cool! Integrating your OpenAI wrapper into the library is a good idea. I'd be happy to take a look at a pull request with that functionality added. I'd love to see what you come up with.

ishaan-jaff commented 9 months ago

Hi @Aws-killer, do you use LiteLLM Proxy? Can we hop on a call to learn how we can make litellm better for you?

Link to my calendar for your convenience

comeback01 commented 8 months ago

Is there a release date?

snowby666 commented 5 months ago

I'll add this feature in the next release.

snowby666 commented 5 months ago

This was added in v1.4.8. I'll add image, embedding, and file support later.
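Once the bundled server is running, a quick smoke test could look like this; a sketch assuming it listens on localhost:8000 and exposes the standard /v1/models route (both are assumptions, not from the release notes):

```python
# Minimal smoke test against the bundled OpenAI-compatible server; adjust
# the host and port to match your setup.
import requests

print(requests.get("http://localhost:8000/v1/models").json())
```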