-
I'm attempting to use an Azure-hosted ChatGPT model with CurateGPT via the LiteLLM proxy system.
I have the proxy running and tested, and the LLM package underlying CurateGPT is successfully using it, …
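For reference, a LiteLLM proxy exposes an OpenAI-compatible endpoint, so any OpenAI-style client can be pointed at it. A minimal sketch of the request shape, with a hypothetical proxy address, model alias, and key (adjust to your deployment; the actual send is commented out since it needs the proxy running):

```python
import json
import urllib.request

# Hypothetical values: adjust to your LiteLLM proxy deployment.
PROXY_BASE = "http://localhost:4000"   # where the litellm proxy listens
MODEL_ALIAS = "azure-gpt"              # a model_name defined in the proxy config

payload = {
    "model": MODEL_ALIAS,
    "messages": [{"role": "user", "content": "Hello"}],
}

# Build the OpenAI-compatible chat-completions request.
req = urllib.request.Request(
    f"{PROXY_BASE}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer sk-anything"},  # proxy key, if configured
)
# with urllib.request.urlopen(req) as resp:   # requires the proxy to be up
#     print(json.load(resp))
```

Because the proxy speaks the OpenAI wire format, any tool that lets you override the OpenAI base URL should be able to use it the same way.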
goodb updated
2 weeks ago
-
## Context
- JS Teams AI Library 1.6.0
- OpenAI module 4.68.2
- Azure OpenAI service with no content filter (default configuration)
- GPT-4o
- default `OpenAIModel.azureApiVersion`
- no moderator pass…
-
---
name: Feature request
about: Suggest an idea for this project
---
---
**Implementation Details:**
- A Kernel is created from scratch…
-
### Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the [Continue Discord](https://discord.gg/NWtdYexhMs) for questions
- [X] I'm not able to find an [open issue](ht…
-
Different people have already asked for "their" AI provider to be supported.
It is unlikely that we will add support for all of them, but we could switch to a library that allows us to connect to at least…
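One way to frame the suggestion: instead of hard-coding each vendor SDK, reduce every provider to a single "prompt in, text out" interface and register adapters per provider. A rough, provider-agnostic sketch (the names and registry are illustrative, not an actual library):

```python
from typing import Callable, Dict

# Illustrative sketch only: adding a vendor means registering one adapter
# instead of threading a new SDK through the whole code base.
PROVIDERS: Dict[str, Callable[[str], str]] = {}

def register(name: str):
    """Decorator that records a provider adapter under a short name."""
    def wrap(fn: Callable[[str], str]) -> Callable[[str], str]:
        PROVIDERS[name] = fn
        return fn
    return wrap

@register("echo")  # stand-in for a real backend such as Azure OpenAI
def echo_provider(prompt: str) -> str:
    return f"echo: {prompt}"

def complete(provider: str, prompt: str) -> str:
    if provider not in PROVIDERS:
        raise KeyError(f"unknown provider: {provider}")
    return PROVIDERS[provider](prompt)

print(complete("echo", "hi"))  # → echo: hi
```

This is essentially what multi-provider gateway libraries do behind one API, which is why switching to such a library is cheaper than supporting each vendor individually.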
-
**Describe the bug**
I have a remote LLM that serves as our internal proxy for Azure OpenAI and Google Gemini. I have configured it properly, as it does occasionally work. However, I often get an error pop u…
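Since the proxy only fails intermittently, a client-side retry with exponential backoff often papers over transient errors while the root cause is investigated. A generic sketch (not the tool's actual internals; the flaky endpoint below is simulated):

```python
import time

def with_retries(fn, attempts: int = 3, base_delay: float = 0.1):
    """Call fn(), retrying on any exception with exponential backoff."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts: surface the original error
            time.sleep(base_delay * (2 ** i))

# Simulated flaky proxy endpoint: fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("proxy hiccup")
    return "ok"

print(with_retries(flaky))  # → ok
```

If the error survives three backed-off attempts, it is likely a configuration or gateway issue rather than a transient one.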
-
{"hub-mirror": ["ishadows/azure-openai-proxy:latest"]}
-
I'm using a company-specific proxy to interact with OpenAI. Our proxy requires a different URL structure for sending queries, specifically `http://gpt-proxy.xx.com/gateway/azure/chat/completions`
I…
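For illustration, the OpenAI SDKs append `/chat/completions` to the configured base URL, so pointing the base URL at `.../gateway/azure` should produce exactly the route above. With only the standard library, the request can be sketched like this (the host is the one from the report; the payload and headers are assumptions, and the send is commented out since it needs the internal gateway):

```python
import json
import urllib.request

# The SDKs append "/chat/completions" to the base URL, so a base_url of
# ".../gateway/azure" yields the gateway's expected route.
BASE_URL = "http://gpt-proxy.xx.com/gateway/azure"
GATEWAY_URL = BASE_URL + "/chat/completions"

body = {"messages": [{"role": "user", "content": "ping"}]}  # assumed shape
req = urllib.request.Request(
    GATEWAY_URL,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},  # auth header omitted: gateway-specific
    method="POST",
)
# urllib.request.urlopen(req)  # requires access to the internal gateway
```

Whether the gateway also expects a non-standard auth header is proxy-specific and not covered by a plain base-URL override.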
-
I am using AzureOpenAIEncoder in a closed network that can only reach the OpenAI resources through an httpx proxy. The AzureOpenAIEncoder class provides no way to supply that `http_client`.
Exampl…