Closed LarkinZero closed 1 week ago
👀 @LarkinZero
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.
@LarkinZero Hello! I'm Dosu, a bot that can help you solve bugs, answer questions, and become a contributor. I'll help you while you wait for a human maintainer.
I found some similar issues and discussions that might help resolve your problem, including one related to `provider=ollama` [2].

For your specific issue with the `o1` and `o1-mini` models, you need to ensure that the payload for these models is pruned using the `pruneO1Payload` function. This function sets `frequency_penalty` and `presence_penalty` to 0, changes the role of messages from 'system' to 'user', disables streaming, and sets `temperature` and `top_p` to 1. Here is the relevant code snippet:
```typescript
export const o1Models = new Set([
  'o1-preview',
  'o1-preview-2024-09-12',
  'o1-mini',
  'o1-mini-2024-09-12',
]);

export const pruneO1Payload = (payload: ChatStreamPayload) => ({
  ...payload,
  frequency_penalty: 0,
  messages: payload.messages.map((message: OpenAIChatMessage) => ({
    ...message,
    role: message.role === 'system' ? 'user' : message.role,
  })),
  presence_penalty: 0,
  stream: false,
  temperature: 1,
  top_p: 1,
});

export const LobeOpenAI = LobeOpenAICompatibleFactory({
  baseURL: 'https://api.openai.com/v1',
  chatCompletion: {
    handlePayload: (payload) => {
      const { model } = payload;

      if (o1Models.has(model)) {
        return pruneO1Payload(payload) as any;
      }

      return { ...payload, stream: payload.stream ?? true };
    },
  },
  debug: {
    chatCompletion: () => process.env.DEBUG_OPENAI_CHAT_COMPLETION === '1',
  },
  provider: ModelProvider.OpenAI,
});
```
Make sure your implementation includes this function to handle the payload correctly for the `o1` and `o1-mini` models [4].
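As a quick illustration, here is what the pruning step does to a sample payload. This is only a sketch: the `Message` and `Payload` types below are simplified stand-ins for `OpenAIChatMessage` and `ChatStreamPayload`, and the function body mirrors the `pruneO1Payload` logic shown above.

```typescript
// Simplified stand-in types for this sketch (the real types live in lobe-chat's source).
type Message = { role: string; content: string };
type Payload = {
  model: string;
  stream?: boolean;
  temperature?: number;
  messages: Message[];
};

// Same logic as the pruneO1Payload snippet above.
const pruneO1Payload = (payload: Payload) => ({
  ...payload,
  frequency_penalty: 0,
  messages: payload.messages.map((message) => ({
    ...message,
    // o1 models reject the 'system' role, so it is rewritten to 'user'
    role: message.role === 'system' ? 'user' : message.role,
  })),
  presence_penalty: 0,
  stream: false,    // o1 models do not support streaming
  temperature: 1,   // o1 models only accept the default sampling parameters
  top_p: 1,
});

const pruned = pruneO1Payload({
  model: 'o1-mini',
  stream: true,
  temperature: 0.7,
  messages: [{ role: 'system', content: 'You are a helpful assistant.' }],
});

console.log(pruned.stream);           // false
console.log(pruned.temperature);      // 1
console.log(pruned.messages[0].role); // 'user'
```

Note that incompatible values the caller supplied (here `stream: true` and `temperature: 0.7`) are overwritten, which is why sending an unpruned payload to these models fails.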
Additionally, ensure that you have correctly set the `OPENAI_API_KEY` environment variable with a valid API key. Here is the Docker command to run LobeChat with the correct environment variables:
```shell
$ docker run -d -p 3210:3210 \
  -e OPENAI_API_KEY=sk-xxxx \
  -e OPENAI_PROXY_URL=https://api-proxy.com/v1 \
  -e ACCESS_CODE=lobe66 \
  --name lobe-chat \
  lobehub/lobe-chat
```
Replace `sk-xxxx` with your actual OpenAI API key. If you are using a third-party proxy, ensure that `OPENAI_PROXY_URL` is correctly set to the proxy address provided by the third-party service [5].
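If you prefer docker-compose over a raw `docker run`, the same configuration can be expressed as the following fragment. This is only a sketch translating the command above into compose syntax; it is not taken from the LobeChat docs, so adapt it to your setup.

```yaml
# Hypothetical docker-compose equivalent of the docker run command above.
version: '3'
services:
  lobe-chat:
    image: lobehub/lobe-chat
    container_name: lobe-chat
    ports:
      - '3210:3210'
    environment:
      - OPENAI_API_KEY=sk-xxxx            # replace with your real key
      - OPENAI_PROXY_URL=https://api-proxy.com/v1
      - ACCESS_CODE=lobe66
```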
To continue talking to Dosu, mention @dosu.
Solved. The API key's permissions had not been updated; manually refreshing the permissions in the API provider's backend fixes it.
✅ @LarkinZero
This issue is closed. If you have any questions, you can comment and reply.
📦 Deployment environment
Docker
📌 Software version
1.17.2
💻 System environment
Windows, macOS
🌐 Browser
Edge
🐛 Problem description
Logs
📷 Reproduction steps
Without modifying the model's API key, I refreshed the model list and saw that o1 was now supported. After switching to the o1 model and starting a conversation, an error was reported:
"The OpenAI API Key is incorrect or empty. Please check the OpenAI API Key and try again."
Retrying still produced the same error.
🚦 Expected result
No response
📝 Additional information
No response