Closed bincooo closed 7 months ago
Hey @bincooo, great to see you diving into the world of AI with Dify again! Running into a bit of a puzzle, I see? Let's tackle this together.
To address the "Base model coze not found" error when adding a custom model in Dify v0.6.1 with a self-hosted Docker setup, ensure the following:
Proxy Configuration: Confirm that your proxy correctly routes requests to OpenAI's API and lists your custom model at the /v1/models endpoint. This step is crucial for Dify to recognize your model.
API Base URL Update: In api/config.py, ensure HOSTED_OPENAI_API_BASE points to your proxy. If necessary, set it directly in the code:
self.HOSTED_OPENAI_API_BASE = 'https://your-proxy-address.com'
Custom Model in Environment Variables: If your model is a paid one, add it to HOSTED_OPENAI_PAID_MODELS:
HOSTED_OPENAI_PAID_MODELS=gpt-4,gpt-4-turbo-preview,custom-coze
Restart Dify: After updates, restart Dify to apply changes.
Debugging: If issues persist, check proxy logs for errors related to API requests.
For detailed configuration, refer to the api/config.py and openai_api_compatible.yaml files in the Dify codebase. These files outline how to configure Dify for custom OpenAI-compatible models, including API keys, endpoint URLs, and model parameters.
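The proxy check in the first step can be sketched in Python. The payload below follows the shape an OpenAI-compatible /v1/models endpoint returns; the "custom-coze" model id is only an illustrative assumption, substitute your own model name:

```python
import json

def model_listed(models_payload: dict, model_id: str) -> bool:
    """Check whether an OpenAI-style /v1/models payload lists model_id."""
    return any(m.get("id") == model_id for m in models_payload.get("data", []))

# Example payload in the shape an OpenAI-compatible proxy should return
# (the "custom-coze" id is an assumption for illustration):
payload = json.loads(
    '{"object": "list", "data": [{"id": "gpt-4"}, {"id": "custom-coze"}]}'
)
```

If `model_listed` returns False for the payload your proxy actually serves, Dify has no way to resolve the base model, and a "Base model ... not found" error is the expected outcome.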
Is configuring HOSTED_OPENAI_API_BASE in docker-compose.yaml effective?
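It should be, in principle: variables declared under `environment:` in docker-compose.yaml are injected into the container's process environment, which config code can then read. A minimal sketch of that lookup (the fallback URL here is an assumption, not Dify's actual default):

```python
import os

# Simulate docker-compose injecting the variable into the container
# environment (in reality the `environment:` section does this, not code):
os.environ["HOSTED_OPENAI_API_BASE"] = "https://your-proxy-address.com"

# Config code then picks the value up from the process environment:
api_base = os.environ.get("HOSTED_OPENAI_API_BASE", "https://api.openai.com/v1")
```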
? Why was this closed?
Please use English. The issue template you filled in already states "I confirm that I am using English to submit this report"; non-English issues are closed directly.
You can modify the source code where the OpenAI client is initialized to skip certificate verification; the certificate generated by your proxy may be faulty.
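What "skip certificate verification" means at the TLS layer can be sketched with the stdlib `ssl` module. Where exactly Dify's OpenAI client would accept such a setting is not shown here, so treat this as the underlying idea for local debugging rather than a drop-in patch:

```python
import ssl

# Build a context that accepts any certificate, including a proxy's
# self-signed or badly generated one. Only use this while debugging;
# it disables protection against man-in-the-middle attacks.
ctx = ssl.create_default_context()
ctx.check_hostname = False   # must be disabled before setting CERT_NONE
ctx.verify_mode = ssl.CERT_NONE
```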
Self Checks
Dify version
0.6.1
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
As described in the title.
API Base: I filled in a relay (proxy) address, and that address's /v1/models endpoint also returns the custom model; but when adding the model, I get the error: Base model coze not found
✔️ Expected Behavior
No response
❌ Actual Behavior
No response