vespperhq / vespper

Open-source AI copilot that lets you chat with your observability data and code 🧙‍♂️
https://www.vespper.com/?utm_source=github
Apache License 2.0
293 stars 36 forks

Support configurable OpenAI endpoint #5

Closed lenaxia closed 5 months ago

lenaxia commented 5 months ago

Description: Being able to use other endpoints would greatly help with adoption and flexibility on the user's end. OpenAI-compatible endpoints are abundant in both self-hosted and enterprise solutions. Just being able to define such an endpoint would allow the use of services like AWS Bedrock (via LiteLLM) and LocalAI.
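To illustrate why this works across so many vendors: OpenAI-compatible servers all accept the same request path and JSON payload, so switching providers is just a matter of changing the base URL. A minimal sketch below builds such a request with only the standard library; the base URL and model name are hypothetical placeholders, not values from Vespper's actual configuration.

```python
import json
from urllib import request

# Hypothetical base URL for an OpenAI-compatible server
# (e.g. LocalAI, or a LiteLLM proxy fronting AWS Bedrock).
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build a chat-completion request in the OpenAI wire format.

    The path (/chat/completions) and payload shape are identical
    across compatible servers, so only base_url needs to change.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(BASE_URL, "gpt-4o", "Why did the pager fire?")
```

Sending `req` with `urllib.request.urlopen` against any compatible endpoint returns the familiar `choices[0].message.content` response, which is exactly what makes a single configurable endpoint setting so flexible.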

Ask:

david1542 commented 5 months ago

Hi @lenaxia, thanks for creating this issue. We're indeed working on this ability, and we'll update here once it's released. Thanks again!

david1542 commented 5 months ago

Hi @lenaxia,

Just updating that we've released support for other vendors via LiteLLM :) We've added instructions on how to set that up in the quickstart guide.

I'm closing this issue for now. You're free to try it 😊 If you have any problems, feel free to open another issue & write to us on Slack.