lobehub / lobe-chat

🤯 Lobe Chat - an open-source, modern-design AI chat framework. Supports Multi AI Providers (OpenAI / Claude 3 / Gemini / Ollama / Azure / DeepSeek), Knowledge Base (file upload / knowledge management / RAG), Multi-Modals (Vision/TTS) and plugin system. One-click FREE deployment of your private ChatGPT/Claude application.
https://chat-preview.lobehub.com

[Bug] Local custom OpenAI API configuration is invalid #3446

Open wangxiaodong1021 opened 1 month ago

wangxiaodong1021 commented 1 month ago

📦 Environment

Zeabur

📌 Version

v1.9.6

💻 Operating System

macOS

🌐 Browser

Edge

🐛 Bug Description

I configured an OpenAI API proxy service on the server side, which restricts access to certain models.

Then I customized the OpenAI API configuration locally, using a different set of API keys, but the local configuration does not take effect in the latest version.

Previous versions prioritized the local configuration.
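The expected precedence the reporter describes can be sketched as a simple merge where local values, when set, win over server-side defaults. This is an illustrative sketch only; `resolveOpenAIConfig` and the config shape are hypothetical names, not LobeChat's actual implementation:

```typescript
// Hypothetical sketch of client-side config precedence.
// Local (browser-side) values, when present, should override
// the server-side defaults, matching pre-v1.9.6 behavior.

interface OpenAIConfig {
  apiKey?: string;
  baseURL?: string;
}

function resolveOpenAIConfig(
  server: OpenAIConfig,
  local: OpenAIConfig,
): OpenAIConfig {
  return {
    // ?? keeps the server value only when the local one is unset
    apiKey: local.apiKey ?? server.apiKey,
    baseURL: local.baseURL ?? server.baseURL,
  };
}
```

Under this sketch, a locally configured API key would be used even when the server defines its own proxy endpoint, which is the behavior the reporter says regressed.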

📷 Recurrence Steps

No response

🚦 Expected Behavior

No response

📝 Additional Information

No response

lobehubbot commented 1 month ago

👀 @wangxiaodong1021

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.

qisumi commented 2 weeks ago

I'm encountering a similar problem, and the error looks like:

{
  "error": {},
  "endpoint": "https://api.ba****pt.com/v1",
  "provider": "openai"
}

When using server request mode, it cannot fetch the model list.
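For context, an OpenAI-compatible provider exposes its model list at `GET {baseURL}/models` with a Bearer token, which is the request that fails here. A minimal sketch of how that request is built (the helper name `buildModelListRequest` is hypothetical, not LobeChat code):

```typescript
// Sketch: build the request used to list models from an
// OpenAI-compatible endpoint, e.g. GET https://.../v1/models

interface ModelListRequest {
  url: string;
  headers: Record<string, string>;
}

function buildModelListRequest(baseURL: string, apiKey: string): ModelListRequest {
  return {
    // strip a trailing slash so we don't produce ".../v1//models"
    url: `${baseURL.replace(/\/$/, '')}/models`,
    headers: { Authorization: `Bearer ${apiKey}` },
  };
}
```

If the error payload above shows the right `endpoint` but the request still fails, the key being sent (server-side vs. locally configured) is the likely culprit, consistent with the original report.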