-
### Is your feature request related to a problem? Please describe.
Using o1-mini with OI is limited right now: because it lacks vision support and has tight token limits, we can't really use the latest and best LLM
### Describe …
-
LiteLLM started supporting OpenAI o1 models from version 1.44.27 onwards: https://github.com/BerriAI/litellm/releases/tag/v1.44.27
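A minimal sketch of calling an o1 model through LiteLLM (assuming litellm >= 1.44.27 is installed and an API key is configured). The `prep_o1_messages` helper below is hypothetical, not part of the LiteLLM API; it folds the system prompt into the first user turn, since the initial o1 releases rejected the `system` role.

```python
# Sketch: preparing messages for an o1 model via LiteLLM.
# prep_o1_messages is a hypothetical helper, not part of LiteLLM itself.

def prep_o1_messages(messages):
    """Fold any system messages into the first user turn, since the
    initial o1-preview / o1-mini releases rejected the 'system' role."""
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    if system_parts and rest and rest[0]["role"] == "user":
        rest[0] = {
            "role": "user",
            "content": "\n\n".join(system_parts) + "\n\n" + rest[0]["content"],
        }
    return rest

msgs = prep_o1_messages([
    {"role": "system", "content": "Answer tersely."},
    {"role": "user", "content": "What is 2 + 2?"},
])
# The result can then be passed straight to LiteLLM:
# import litellm
# resp = litellm.completion(model="o1-mini", messages=msgs)
```
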
-
### What happened?
When using the Azure OpenAI API with the `o1-preview` and `o1-mini` models, an error occurs because the request includes the unsupported parameter `stream`.
Here is my Azure OpenAI endp…
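A minimal sketch of one workaround, assuming the request is assembled as a plain kwargs dict before being sent to the Azure OpenAI client. The `sanitize_o1_kwargs` helper and the exact list of rejected parameters are assumptions; the idea is to drop `stream` (and other parameters the early o1 models rejected) for o1 deployments and fall back to a single blocking response.

```python
# Sketch: strip parameters the o1-preview / o1-mini models reject before
# sending a request to Azure OpenAI. sanitize_o1_kwargs is a hypothetical
# helper, and the reject list below is an assumption about the early
# o1 releases, not an official specification.

UNSUPPORTED_O1_PARAMS = {"stream", "temperature", "top_p"}

def sanitize_o1_kwargs(model: str, kwargs: dict) -> dict:
    """Return a copy of kwargs that is safe to send to an o1 deployment."""
    if model.startswith(("o1-preview", "o1-mini")):
        return {k: v for k, v in kwargs.items() if k not in UNSUPPORTED_O1_PARAMS}
    return dict(kwargs)

safe = sanitize_o1_kwargs("o1-mini", {"stream": True, "max_completion_tokens": 512})
```

With `stream` removed, the caller receives one complete (non-streamed) response instead of an error.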
-
When using o1-mini, the model doesn't stream its output, so the frontend shows a Cloudflare 524 error when the response hangs.
**Desktop**
- OS: Windows
- Browser: Firefox
- ChatHub V…
-
This just got released today: o1-mini.
-
### Check for existing issues
- [X] Completed
### Describe the feature
https://openai.com/index/introducing-openai-o1-preview/
OpenAI recently released a new set of models that supposedly reason…
-
### Check for existing issues
- [X] Completed
### Describe the feature
Add new models to the Zed Assistant.
`o1` "thinks" longer on prompts before answering, which would be nice within t…
-
Test using `o1-preview` for runtime lead analysis, with a fallback to 4o, or to Sonnet 3.5 with structured output to reduce provider dependency.
-
I'm getting the following error when running o1-series models; it seems that some of the utility calls cannot accept the new argument `max_completion_tokens`. I could also be mistaken, not exactly sur…
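The o1 models replaced `max_tokens` with `max_completion_tokens`, so utility code that still passes the old name needs a small shim. A minimal sketch, assuming requests are built as kwargs dicts; the `remap_token_param` helper is hypothetical.

```python
# Sketch: remap max_tokens to max_completion_tokens for o1-series models.
# remap_token_param is a hypothetical helper, not part of any SDK.

def remap_token_param(model: str, kwargs: dict) -> dict:
    """Return a copy of kwargs with the token-limit parameter renamed
    for o1 models, which reject max_tokens."""
    out = dict(kwargs)
    if model.startswith("o1") and "max_tokens" in out:
        out["max_completion_tokens"] = out.pop("max_tokens")
    return out

patched = remap_token_param("o1-preview", {"max_tokens": 1024})
```

Older models keep `max_tokens` untouched, so the shim can sit in front of every call site.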
-
**Description:**
Create a new `openai()` function to eventually supersede the existing `chatgpt()` function, adding support for the newer structured output instead of just simple json-mode and adequa…