pezzolabs / pezzo

🕹️ Open-source, developer-first LLMOps platform designed to streamline prompt design, version management, instant delivery, collaboration, troubleshooting, observability and more.
https://pezzo.ai
Apache License 2.0

Support self-hosted models (non-OpenAI flavor) #283

Open cryoff opened 10 months ago

cryoff commented 10 months ago

Proposal

It would be great to be able to use non-OpenAI APIs. For example, in a real-world setting I could use Llama 2 or even Flan-T5, host it somewhere, expose an API endpoint, set up the Pezzo proxy to forward requests, and submit custom headers. Similar functionality is currently implemented in LangChain using custom callbacks; that could be an approach here as well.
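To make the idea concrete, here is a minimal sketch of what such a proxied request could look like. This is purely illustrative: it assumes the self-hosted model sits behind an OpenAI-compatible chat-completions endpoint, and the proxy URL and `X-Pezzo-*` header names are hypothetical, not real Pezzo configuration.

```python
import json
import urllib.request

# Hypothetical proxy endpoint; not a real Pezzo URL.
PROXY_URL = "https://proxy.example.com/v1/chat/completions"


def build_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) a request that the proxy could forward
    to a self-hosted backend, carrying custom routing headers."""
    body = json.dumps({
        "model": "llama-2-13b-chat",  # served by the self-hosted backend
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        # Hypothetical custom headers the proxy could use for routing:
        "X-Pezzo-Target-Url": "http://llama2.internal:8000",
        "X-Pezzo-Api-Key": "sk-placeholder",
    }
    return urllib.request.Request(
        PROXY_URL, data=body, headers=headers, method="POST"
    )


req = build_request("Hello!")
# urllib normalizes header keys via str.capitalize()
print(req.get_header("X-pezzo-target-url"))
```

The point is only that nothing in the request itself is OpenAI-specific: as long as the proxy forwards the body and the custom headers, the backend can be any model host.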

Use-Case

No response

Is this a feature you are interested in implementing yourself?

Maybe

developbiao commented 4 months ago

Hello, can this project support other LLMs, for example Gemini (via Vertex AI) or other third-party providers? Are there any plans to add this feature in the future? Thanks :)