-
Enables access to retrieval and code interpreter
https://platform.openai.com/docs/assistants/overview
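As a rough sketch of what enabling these tools looks like with the `openai` Python SDK (assuming the Assistants v2 beta, where the retrieval tool is now called `file_search`; the assistant name and model below are illustrative):
```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Create an assistant with both built-in tools enabled.
# In Assistants API v2 the "retrieval" tool was renamed to "file_search".
assistant = client.beta.assistants.create(
    name="data-helper",  # illustrative name
    model="gpt-4o",
    tools=[
        {"type": "code_interpreter"},
        {"type": "file_search"},
    ],
)
print(assistant.id)
```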
-
I have my `ENDPOINT` and `API_KEY` variables from Azure OpenAI.
The variables look like this:
```
API_KEY = "XXXXXX"
ENDPOINT = "https://.openai.azure.com/openai/deployments/gpt-4o/chat/completion…
```
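For reference, a minimal sketch of how these two variables are usually wired into the `openai` Python SDK (the `api_version`, deployment name, and base endpoint below are assumptions; note the client expects the resource's base URL rather than the full `/chat/completions` path):
```python
from openai import AzureOpenAI

# Illustrative values; the real endpoint and key come from the variables above.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # base URL only
    api_key="XXXXXX",
    api_version="2024-06-01",  # assumed; use the version your resource supports
)

response = client.chat.completions.create(
    model="gpt-4o",  # the Azure deployment name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```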
-
Type: Bug
## Extension Information
- Cody Version: 1.38.3
- VS Code Version: 1.94.2
- Extension Host: desktop
## Steps to Reproduce
1. Ask a question and select OpenAI o1-preview or mini
2.
3.
…
-
## Summary
Support any OpenAI-compatible endpoint, such as tabbyAPI, vLLM, ollama, etc.
I am running Qwen2.5-coder 32B with [tabbyAPI](https://github.com/theroyallab/tabbyAPI), which is an OpenAI …
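As a sketch of what this looks like from the client side, the standard `openai` SDK can already talk to such servers by overriding the base URL (host, port, and model name below are illustrative):
```python
from openai import OpenAI

# Point the standard OpenAI client at a self-hosted, OpenAI-compatible server.
# The base_url and model name are illustrative; tabbyAPI, vLLM, and ollama each
# expose their own host/port and model identifiers.
client = OpenAI(
    base_url="http://localhost:5000/v1",
    api_key="not-needed-for-local",  # many local servers ignore the key
)

completion = client.chat.completions.create(
    model="Qwen2.5-Coder-32B-Instruct",
    messages=[{"role": "user", "content": "Write a hello-world in Go."}],
)
print(completion.choices[0].message.content)
```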
-
I’ve tried to use the `autoevals.Factuality` module with my Azure endpoint. I noticed the `engine` argument in the documentation, so I assumed that it corresponds to the model name for Azure OpenA…
-
via [model listing](https://platform.openai.com/docs/api-reference/models/list) API call
```python
import openai

# Requires the OPENAI_API_KEY environment variable (or openai.api_key) to be set.
openai.models.list()
```
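Equivalently, with an explicit client instance (again assuming `OPENAI_API_KEY` is set in the environment), the returned page can be iterated to print each model id:
```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
for model in client.models.list():
    print(model.id)
```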
-
The current AzureOpenAI auto-configuration code assumes that users will provide a static API key; if no key is set, an exception is thrown, see https://github.com/spring-projects/spring-ai/blob/c…
-
### Idea
We have been using Sonnet 3.5 for search-and-replace style editing; it does make sense to try and push Predicted Output (which is another way of saying speculative editing) and see how that p…
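For context, a minimal sketch of OpenAI's Predicted Outputs feature with the `openai` Python SDK (the model choice and edit prompt are illustrative): the existing file is passed as the prediction so unchanged spans can be reused instead of regenerated.
```python
from openai import OpenAI

client = OpenAI()

# The current file contents double as the prediction, letting the model
# copy unchanged spans and only generate the edited parts.
original_code = 'def greet(name):\n    print("Hello, " + name)\n'

response = client.chat.completions.create(
    model="gpt-4o",  # Predicted Outputs is a gpt-4o family feature
    messages=[
        {"role": "user", "content": "Rename the function greet to say_hello. Output only the full file."},
        {"role": "user", "content": original_code},
    ],
    prediction={"type": "content", "content": original_code},
)
print(response.choices[0].message.content)
```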
-
New API Review meeting has been requested.
**Service Name**: Azure OpenAI Service - Azure OpenAI Service
**Review Created By**: Jia Liu
**Review Date**: 12/11/2024 09:00 AM PT
**Release Plan**: [](ht…
-
Many organizations that use Azure OpenAI will want to use Entra ID for authentication. LiteLLM already supports this:
https://litellm.vercel.app/docs/providers/azure#entrata-id---use-tena…
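For comparison, the pattern with the plain `openai` SDK plus `azure-identity` looks roughly like this (a sketch assuming `DefaultAzureCredential` and an illustrative `api_version`; this is not LiteLLM's own configuration):
```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# Exchange an Entra ID credential for short-lived bearer tokens
# instead of using a static API key.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # illustrative
    azure_ad_token_provider=token_provider,
    api_version="2024-06-01",  # assumed
)

response = client.chat.completions.create(
    model="gpt-4o",  # the Azure deployment name
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```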