-
### Your current environment
I am currently using a T4 instance on Google Colaboratory.
```
Collecting environment information...
PyTorch version: 2.3.0+cu121
Is debug build: False
CUDA used…
```
-
### The Feature
Impacts the following providers:
- Azure
- OpenAI
- Sagemaker
- Bedrock
### Motivation, pitch
-
### Twitter / LinkedIn details
_No response_
-
Hello,
Currently this AHK code only works with the official OpenAI API.
Could you please add support for the OpenAI service on Microsoft Azure?
What about making the code work properly under the Az…
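A minimal sketch (plain Python rather than AHK) of why Azure support is not just a base-URL swap: the official OpenAI API and Azure OpenAI differ in URL shape, auth header, and how the model is selected. The resource, deployment, and api-version values below are illustrative placeholders, not taken from this issue.

```python
def build_request(provider, api_key, deployment="my-deployment",
                  resource="my-resource", api_version="2024-02-01"):
    """Return (url, headers) for a chat-completions call on either provider."""
    if provider == "openai":
        # OpenAI: bearer token; the model is chosen in the JSON body.
        url = "https://api.openai.com/v1/chat/completions"
        headers = {"Authorization": f"Bearer {api_key}"}
    elif provider == "azure":
        # Azure: "api-key" header; the deployment name lives in the URL path
        # and an api-version query parameter is required.
        url = (f"https://{resource}.openai.azure.com/openai/deployments/"
               f"{deployment}/chat/completions?api-version={api_version}")
        headers = {"api-key": api_key}
    else:
        raise ValueError(f"unknown provider: {provider}")
    return url, headers
```

Any client that hardcodes the first shape will fail against Azure, which is why a provider switch like this is needed.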
lchyn updated 11 months ago
-
### The Feature
Some repos use `openai.proxy`, and we can't entirely replace `openai` for their usage. For example:
```python
def __init_openai(self, config):
    if self.proxy != '':
        …
```
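One hedged workaround sketch: the pre-1.0 SDK honored the `openai.proxy` attribute, which the v1 SDK removed, but the v1 client's underlying HTTP library (httpx) respects the standard proxy environment variables by default, so setting those covers both SDK generations without touching `openai.proxy`. The `proxy` config key here is hypothetical, standing in for whatever the repo's config object provides.

```python
import os

def apply_proxy(config):
    """Route OpenAI traffic through a proxy without using `openai.proxy`.

    Setting the standard proxy environment variables works for both the
    pre-1.0 SDK and the v1 SDK (whose httpx client trusts the environment
    by default). `config` is a plain dict here; the "proxy" key is assumed.
    """
    proxy = config.get("proxy", "")
    if proxy:
        os.environ["HTTP_PROXY"] = proxy
        os.environ["HTTPS_PROXY"] = proxy
```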
-
I tried both the released version and a build compiled from `main`. Regardless of which example script I ran, it always panicked at the same place:
https://github.com/gptscript-ai/gptscript/blob/abdf8d3ba52a9…
-
## Summary
| Status | Count |
|---------------|-------|
| 🔍 Total | 1507 |
| ✅ Successful | 989 |
| ⏳ Timeouts | 3 |
| 🔀 Redirected | 0 |
| 👻 Excluded | 353 |
| ❓ Unknown…
-
### What happened?
When a non-OpenAI / non-Azure model is passed in, an async client initialization error occurs.
### Relevant log output
```shell
llm-proxy:dev: An error occurred: 'stream_async_…
```
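A minimal sketch of one way a proxy can avoid this class of failure: gate async client construction on whether the provider actually supports it, and fall back to a sync client instead of erroring during initialization. The client classes and provider set below are hypothetical stand-ins, not the project's real code.

```python
import asyncio

# Hypothetical stand-ins; the real proxy would wrap the actual sync and
# async SDK clients for each provider.
class SyncClient:
    def complete(self, prompt):
        return f"sync:{prompt}"

class AsyncClient:
    async def complete(self, prompt):
        return f"async:{prompt}"

ASYNC_CAPABLE = {"openai", "azure"}  # assumed set of async-ready providers

def get_client(provider, use_async):
    """Fall back to a sync client when the provider lacks async support,
    instead of raising during async-client initialization."""
    if use_async and provider not in ASYNC_CAPABLE:
        use_async = False  # graceful degradation rather than an init error
    return AsyncClient() if use_async else SyncClient()

async def run(provider, prompt):
    client = get_client(provider, use_async=True)
    result = client.complete(prompt)
    # Await only if the chosen client returned a coroutine.
    return await result if asyncio.iscoroutine(result) else result
```

With this shape, a request for a non-OpenAI / non-Azure model degrades to the sync path rather than crashing.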
-
## Summary
| Status | Count |
|---------------|-------|
| 🔍 Total | 1507 |
| ✅ Successful | 990 |
| ⏳ Timeouts | 3 |
| 🔀 Redirected | 0 |
| 👻 Excluded | 353 |
| ❓ Unknown…
-
- [ ] Test Azure OpenAI (v1) Python SDK
- [ ] Add samples
-
### The Feature
Starting this issue to ensure LiteLLM is compatible with OpenAI v1.0.0
## The main goal of this issue:
If a user has OpenAI v1.0.0, their OpenAI calls through litellm should not…
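For context, the surface change a compatibility layer has to bridge is the move from module-level calls to a client object. A small hedged sketch of version-based dispatch (the helper below is illustrative, not LiteLLM's actual mechanism):

```python
def call_style(openai_version: str) -> str:
    """Return which call style the installed openai package expects."""
    major = int(openai_version.split(".")[0])
    return "client" if major >= 1 else "module"

# pre-1.0 (module-level):
#   openai.ChatCompletion.create(model=..., messages=...)
# v1+ (client object):
#   client = openai.OpenAI()
#   client.chat.completions.create(model=..., messages=...)
```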