-
Align the SDK with how others in this space structure theirs.
See BrainTrustData for an example:
```
from openai import OpenAI
import os
import time
client = OpenAI(
base_url="https://bra…
-
### Bug Description
When using a proxy like LiteLLM, the model names may not exactly match the standard OpenAI model names the code expects. The validation code is not configurable and as…
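As a sketch of what configurable validation could look like (the function and variable names below are hypothetical, not from any project's actual code), an allowlist that the user can extend with proxy-specific model names:

```python
# Hypothetical sketch: model-name validation with a user-extensible
# allowlist, so proxies like LiteLLM that expose non-standard names
# can still pass. All names here are illustrative only.

DEFAULT_MODELS = {"gpt-4o", "gpt-4o-mini", "gpt-3.5-turbo"}

def is_valid_model(name, extra_models=None, allow_any=False):
    """Accept standard names, user-registered names, or (optionally) any name."""
    if allow_any:
        return True
    return name in DEFAULT_MODELS | (extra_models or set())

# A proxy-routed name fails strict validation...
assert not is_valid_model("litellm-team-gpt")
# ...but passes once registered, or when validation is relaxed entirely.
assert is_valid_model("litellm-team-gpt", extra_models={"litellm-team-gpt"})
assert is_valid_model("anything-at-all", allow_any=True)
```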
-
**Is your feature request related to a problem? Please describe.**
Feature request: disabling the OpenAI API disables all connections; I currently have several (OpenAI, Bedrock Proxy & Pipelines)
**Describe the…
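One way the request could be satisfied is a per-connection enable flag instead of a single global switch. A minimal sketch, with illustrative connection names taken from the issue (the structure is hypothetical, not the project's actual config schema):

```python
# Hypothetical sketch: per-connection "enabled" flags, so turning off
# one OpenAI-compatible endpoint does not disable the others.
# Keys and URLs below are placeholders.

connections = {
    "openai":        {"base_url": "https://api.openai.com/v1", "enabled": False},
    "bedrock-proxy": {"base_url": "http://bedrock-proxy.local/v1", "enabled": True},
    "pipelines":     {"base_url": "http://pipelines.local/v1", "enabled": True},
}

def active_connections(conns):
    """Return only the connections whose individual flag is on."""
    return [name for name, cfg in conns.items() if cfg["enabled"]]

# Disabling "openai" leaves the other two endpoints usable.
assert active_connections(connections) == ["bedrock-proxy", "pipelines"]
```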
-
**Is your feature request related to a problem? Please describe.**
Currently this integration does not support Azure OpenAI, which works slightly differently from a standard OpenAI custom server.
…
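The difference the issue alludes to is mostly in how requests are addressed and authenticated. A minimal sketch of the two URL shapes (the resource, deployment, and API-version values are placeholders, not from the issue):

```python
# Sketch of how Azure OpenAI request URLs differ from standard OpenAI.
# Resource/deployment names and the api-version are placeholders.

def openai_url(path="chat/completions"):
    # Standard OpenAI: one fixed host, model chosen in the request body,
    # authenticated with an "Authorization: Bearer <key>" header.
    return f"https://api.openai.com/v1/{path}"

def azure_openai_url(resource, deployment, api_version,
                     path="chat/completions"):
    # Azure OpenAI: per-resource host, model selected by *deployment*
    # name in the URL path, a mandatory api-version query parameter,
    # and an "api-key: <key>" header instead of a Bearer token.
    return (f"https://{resource}.openai.azure.com/openai/deployments/"
            f"{deployment}/{path}?api-version={api_version}")

assert openai_url() == "https://api.openai.com/v1/chat/completions"
assert azure_openai_url("my-res", "gpt4-deploy", "2024-02-01") == (
    "https://my-res.openai.azure.com/openai/deployments/"
    "gpt4-deploy/chat/completions?api-version=2024-02-01"
)
```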
-
### Search before asking
- [X] I had searched in the [issues](https://github.com/eosphoros-ai/DB-GPT/issues?q=is%3Aissue) and found no similar feature requirement.
### Description
Both the embedding model and the summarization model are opena…
-
### 📦 Deployment Method
Other
### 📌 Version
no
### 💻 Operating System
Windows
### 📌 System Version
10
### 🌐 Browser
Chrome
### 📌 Browser Version
Chrome 126.0.6478.182
### 🐛 Bug Descrip…
-
### What happened?
My litellm config
```
- model_name: "*"
  litellm_params:
    model: openai/*
    api_key: xxxxxx
    weight: 1
  model_info:
    metadata: xxxxxx
- model_…
-
**Describe the feature you'd like**
Currently, I don't see support for connecting to Azure OpenAI services behind a proxy layer. This is a limitation because many companies prefer this setu…
-
Hello all! Was curious if anyone has used FastChat as a proxy around OpenAI/Azure endpoints.
Use case:
We have multiple nodes all making calls to either Azure or OpenAI chat endpoints. We have t…
-
I am using AzureOpenAIEncoder on a closed network that can only reach the OpenAI resources through an httpx proxy. The AzureOpenAIEncoder class provides no way to supply that http_client.
Exampl…
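A sketch of what the requested hook could look like: build the proxy settings from the environment, then pass them through as an `http_client`. The helper below is hypothetical (not part of AzureOpenAIEncoder today, which is exactly what the issue is asking for), and the commented usage assumes the `httpx` and `openai` packages:

```python
import os

def proxy_client_kwargs(env_var="HTTPS_PROXY"):
    """Build httpx.Client keyword arguments from an environment variable.

    Returns an empty dict when no proxy is configured, so the result can
    always be splatted into httpx.Client(**kwargs). Hypothetical helper.
    """
    proxy = os.environ.get(env_var)
    return {"proxy": proxy} if proxy else {}

# Hypothetical usage (requires `httpx` and `openai`; endpoint and
# api-version are placeholders, and passing http_client through to
# AzureOpenAIEncoder is the requested feature, not current behaviour):
#
#   import httpx
#   from openai import AzureOpenAI
#
#   http_client = httpx.Client(**proxy_client_kwargs())
#   client = AzureOpenAI(
#       azure_endpoint="https://my-resource.openai.azure.com",
#       api_version="2024-02-01",
#       api_key=os.environ["AZURE_OPENAI_API_KEY"],
#       http_client=http_client,
#   )
```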