-
The official API for OpenAI's o1 series does not support streaming output; I notice this has just been fixed.
However, some relay platforms offer models such as `o1-mini-all`, obtained by reverse-engineering the ChatGPT web client, and these can stream. The current code logic, though, seems to force non-streaming whenever the model name contains `o1-` (and the provider is OpenAI). This causes two problems:
1. When adding a custom provider, using `o1-mini` and `o…
-
Here are some new simple misguided riddles
**I'm tall when I'm young, and I'm taller when I'm old. What am I?**
Definitely not a candle
**I'm tall when I'm young, and I'm taller when I'm old. Wh…
-
### Jan version
v0.5.4-640
### Describe the Bug
When making a request using o1-preview or o1-mini, I get:
Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_complet…
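A thin request shim can paper over this until it is fixed upstream. The sketch below assumes an OpenAI-style payload dict; `adapt_params` is a hypothetical helper name.

```python
# Sketch: o1-family models reject 'max_tokens' and require
# 'max_completion_tokens', so rename the key before sending.

def adapt_params(model: str, params: dict) -> dict:
    params = dict(params)  # don't mutate the caller's dict
    if model.startswith("o1-") and "max_tokens" in params:
        params["max_completion_tokens"] = params.pop("max_tokens")
    return params

adapt_params("o1-preview", {"max_tokens": 1024})
# -> {"max_completion_tokens": 1024}
```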
-
### What happened?
When using a fake stream with o1, we get this problem:
```
stream_options: { "include_usage": true },
stream: true,
```
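Since a fake stream makes a non-streaming upstream request anyway, one fix is to drop the stream-only fields before forwarding the payload. A minimal sketch, assuming an OpenAI-style payload dict (`to_non_streaming` is a hypothetical helper name):

```python
# Sketch: strip fields that only make sense with stream=True
# before forwarding a fake-stream request upstream.

STREAM_ONLY_KEYS = ("stream", "stream_options")

def to_non_streaming(payload: dict) -> dict:
    return {k: v for k, v in payload.items() if k not in STREAM_ONLY_KEYS}

to_non_streaming({
    "model": "o1-mini",
    "stream": True,
    "stream_options": {"include_usage": True},
})
# -> {"model": "o1-mini"}
```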
### Relevant log output
```shell
error: {
message…
-
### Has this bug already been reported in an existing issue?
- [X] I confirm there is no existing issue, and I have read the **FAQ**.
### Bug behavior
Currently, after the o1-mini model answers a question, the frontend first renders the reply as normal markdown, then automatically refreshes once, after which the whole reply collapses into a single paragraph that is very hard to read.
![2024-09-18 22 09 09](https://github.com/user-attachm…
-
Uncaught runtime errors:
×
ERROR
Cannot read properties of undefined (reading 'split')
TypeError: Cannot read properties of undefined (reading 'split')
at Proxy.initMap (webpack-interna…
-
Just updated to 0.56.0; now when trying to start aider with o1-mini via openrouter I get this message:
aider --model openrouter/openai/o1-mini
────────────────────────────────────────────────────────…
-
I believe it needs to be added to default_models.py, and some documentation should be added as well.
I just tried a local patch and the API reported that `o1-preview` does not exist or I don't have access to it, …
-
Waiting for the new o1 model integration!!!!
-
**Describe the bug**
Run pytest for test_agent.py or anything else that uses llama-3.1-70b-versatile through openai_gpt.py and it fails with:
unexpected keyword argument 'max_completion_tokens'
…
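This is the mirror image of the o1 problem: only the o1 family takes `max_completion_tokens`, while other OpenAI-compatible backends (such as Groq's llama-3.1-70b-versatile) still expect `max_tokens`. A sketch of normalizing the limit per model (`normalize_limit` is a hypothetical helper name):

```python
# Sketch: route the token limit to whichever keyword the target
# model actually accepts, popping both candidate keys first.

def normalize_limit(model: str, kwargs: dict) -> dict:
    kwargs = dict(kwargs)  # don't mutate the caller's dict
    limit = kwargs.pop("max_completion_tokens", kwargs.pop("max_tokens", None))
    if limit is not None:
        key = "max_completion_tokens" if model.startswith("o1-") else "max_tokens"
        kwargs[key] = limit
    return kwargs

normalize_limit("llama-3.1-70b-versatile", {"max_completion_tokens": 512})
# -> {"max_tokens": 512}
```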