-
If anyone knows someone who is tier 5 on OpenAI (@royerloic maybe?), they could benchmark the new o1 model. I am only tier 3 and have to wait...
https://x.com/OpenAI/status/1834278218888872042
ht…
-
Except for o1-mini and o1-preview, all other models connect normally.
![image](https://github.com/user-attachments/assets/aa6ff92b-f8eb-492d-86d2-82e48503e1b1)
-
Immediate thought on reading the Python file, lol. I'll see if I can send a PR for it later.
-
The o1 models' API is slightly different: there is no system prompt or Temperature support, and the `max_tokens` field has changed to `max_completion_tokens`.
Hope the author can add support soon.
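For reference, a minimal sketch of what the changed request shape looks like with the official `openai` Python SDK (>= 1.x); the model name, prompt, and token limit below are placeholders, not values from this report:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="o1-preview",
    messages=[
        # o1 models currently reject "system" messages, so only user turns are sent
        {"role": "user", "content": "Summarize the release notes."},
    ],
    # `max_tokens` is replaced by `max_completion_tokens` for o1 models,
    # and `temperature` is not supported, so it is omitted entirely
    max_completion_tokens=1024,
)
print(response.choices[0].message.content)
```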
-
Some of the models seem unable to function through OpenRouter, as they return blank responses (repro sketch after the list).
Affected models:
OpenAI: o1-mini (2024-09-12)
OpenAI: o1-mini
OpenAI: o1-preview (2024-09-12)
Ope…
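A hedged repro sketch, assuming OpenRouter's usual OpenAI-compatible endpoint and model ID conventions (the base URL, model ID, and key below are assumptions, not taken from this report):

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    api_key="sk-or-...",                      # placeholder OpenRouter key
)

response = client.chat.completions.create(
    model="openai/o1-mini",  # assumed OpenRouter ID for one affected model
    messages=[{"role": "user", "content": "Say hello."}],
)

content = response.choices[0].message.content
# The reported bug: `content` comes back empty for the o1 models
print("blank response" if not (content and content.strip()) else content)
```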
-
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**…
-
### Feature Description
Add Azure prompt caching metadata
https://learn.microsoft.com/en-us/answers/questions/2085985/prompt-caching-in-azure-openai
This forum post says Azure has added promp…
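A sketch of what consuming that metadata could look like, assuming the `openai` Python SDK surfaces `usage.prompt_tokens_details.cached_tokens` for Azure chat completions (the endpoint, API version, and deployment name are placeholders):

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://example.openai.azure.com",  # placeholder endpoint
    api_version="2024-10-01-preview",                   # assumed API version
)

response = client.chat.completions.create(
    model="gpt-4o",  # Azure deployment name, placeholder
    messages=[{"role": "user", "content": "Hello"}],
)

usage = response.usage
details = getattr(usage, "prompt_tokens_details", None)
cached = getattr(details, "cached_tokens", None) if details is not None else None
print("prompt tokens:", usage.prompt_tokens, "cached tokens:", cached)
```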
-
### Description
The o1 models do not appear to be supported for Azure, which launched API access for them this month. As with OpenAI, they changed `max_tokens` to `max_completion_tokens` for the o1…
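One way a library could adapt, sketched under the assumption that only the parameter names and supported sampling options differ (the helper below is illustrative, not the project's actual code):

```python
def adapt_params_for_o1(model: str, params: dict) -> dict:
    """Rename/drop request parameters that o1-family models reject."""
    params = dict(params)  # avoid mutating the caller's dict
    if model.startswith("o1"):
        if "max_tokens" in params:
            # o1 expects `max_completion_tokens` instead of `max_tokens`
            params["max_completion_tokens"] = params.pop("max_tokens")
        # o1 models reject explicit sampling parameters
        params.pop("temperature", None)
        params.pop("top_p", None)
    return params

# Example output: {'max_completion_tokens': 512}
print(adapt_params_for_o1("o1-preview", {"max_tokens": 512, "temperature": 0.7}))
```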
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a sim…
-
When initiating text generation with o1-preview or o1-preview-2024-09-12, no text is generated; instead, the error
"TG Error: Maximum call stack size exceeded"
is displayed.