Closed lenaxia closed 5 months ago
Hi @lenaxia , thanks for creating this issue. We're indeed working on this ability, and we'll update here once it's released. Thanks again for filing this issue!
Hi @lenaxia,
Just updating that we've released support for other vendors using LiteLLM :) We've added instructions on how to set that up in the quickstart guide.
I'm closing this issue for now. You're free to try it 😊 If you have any problems, feel free to open another issue or write to us on Slack.
Description: Being able to use other endpoints will greatly help with adoption and flexibility on the user's end. OpenAI-compatible endpoints are abundant in both self-hosted and enterprise solutions. Simply being able to define such an endpoint would allow the use of services like AWS Bedrock (via LiteLLM) and LocalAI.
Ask:
OPENAI_ENDPOINT
or some similar configurable environment variable alongside the API token.
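To illustrate the ask, here is a minimal sketch of how such a variable might be resolved. `OPENAI_ENDPOINT` and `resolve_endpoint` are hypothetical names for illustration; the released LiteLLM-based support may use different configuration names.

```python
import os

def resolve_endpoint(env=None):
    """Return the configured OpenAI-compatible endpoint, falling back
    to the official OpenAI API when the variable is unset.

    OPENAI_ENDPOINT is the hypothetical variable proposed in this issue.
    """
    if env is None:
        env = os.environ
    return env.get("OPENAI_ENDPOINT", "https://api.openai.com/v1")

# Example: point at a self-hosted LocalAI instance or a LiteLLM proxy.
print(resolve_endpoint({"OPENAI_ENDPOINT": "http://localhost:8080/v1"}))
# -> http://localhost:8080/v1

# With no override, the default OpenAI endpoint is used.
print(resolve_endpoint({}))
# -> https://api.openai.com/v1
```

Any OpenAI-compatible client that accepts a base URL could then be pointed at the resolved endpoint, alongside the existing API token.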