Closed — opened by allseeworld, closed 2 months ago
Hello @allseeworld! I'm Dosu, a bot here to help you with bug fixes, questions, and contributions. I'm here to assist you while you wait for a human maintainer.
The error "Base model gpt-4o-mini not found" indicates that the model name you are trying to add is not recognized or not supported by your Dify version. Ensure you are using one of the valid model names supported by Dify:
- openai/gpt-4o
- openai/gpt-4
- openai/gpt-4-32k
- openai/gpt-3.5-turbo
- anthropic/claude-3.5-sonnet
- anthropic/claude-3-haiku
- anthropic/claude-3-opus
- anthropic/claude-3-sonnet
- google/gemini-pro-1.5
- google/gemini-flash-1.5
- google/gemini-pro
- cohere/command-r-plus
- cohere/command-r
- meta-llama/llama-3-70b-instruct
- meta-llama/llama-3-8b-instruct
- mistralai/mixtral-8x22b-instruct
- mistralai/mixtral-8x7b-instruct
- mistralai/mistral-7b-instruct
- qwen/qwen-2-72b-instruct
- deepseek/deepseek-chat
- deepseek/deepseek-coder
Make sure to use one of these valid model names instead of gpt-4o-mini
when configuring your model settings [1][2].
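As a rough sketch of what such a check looks like (this is a hypothetical helper, not part of Dify's codebase), one can validate a configured model name against the list above and suggest close matches when it is unsupported — which is exactly what happens with `gpt-4o-mini` on 0.6.12:

```python
# Hypothetical helper: validate a model name against the supported list
# above and suggest close matches for typos or unsupported names.
import difflib

VALID_MODELS = [
    "openai/gpt-4o",
    "openai/gpt-4",
    "openai/gpt-4-32k",
    "openai/gpt-3.5-turbo",
    "anthropic/claude-3.5-sonnet",
    "anthropic/claude-3-haiku",
    "anthropic/claude-3-opus",
    "anthropic/claude-3-sonnet",
    "google/gemini-pro-1.5",
    "google/gemini-flash-1.5",
    "google/gemini-pro",
    "cohere/command-r-plus",
    "cohere/command-r",
    "meta-llama/llama-3-70b-instruct",
    "meta-llama/llama-3-8b-instruct",
    "mistralai/mixtral-8x22b-instruct",
    "mistralai/mixtral-8x7b-instruct",
    "mistralai/mistral-7b-instruct",
    "qwen/qwen-2-72b-instruct",
    "deepseek/deepseek-chat",
    "deepseek/deepseek-coder",
]

def check_model(name: str) -> list[str]:
    """Return [] if `name` is supported, else up to 3 close suggestions."""
    if name in VALID_MODELS:
        return []
    # Also accept the bare model id without the provider prefix.
    bare = {m.split("/", 1)[1]: m for m in VALID_MODELS}
    if name in bare:
        return [bare[name]]
    # Fuzzy-match against both full and bare names.
    candidates = list(VALID_MODELS) + list(bare)
    matches = difflib.get_close_matches(name, candidates, n=3, cutoff=0.6)
    return [bare.get(m, m) for m in matches]
```

For example, `check_model("gpt-4o-mini")` returns suggestions such as `openai/gpt-4o` rather than an empty list, which mirrors why the "Base model not found" error is raised for that name.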
It will be supported in the latest version.
OK, thank you very much.
Self Checks
- Dify version: 0.6.12
- Cloud or Self Hosted: Self Hosted (Source)
- Steps to reproduce: Base model gpt-4o-mini not found
- ✔️ Expected Behavior: No response
- ❌ Actual Behavior: No response