langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

gpt-4o-mini not available in the list of OpenAI models - docker compose installation #6507

Closed: amatiytsiv closed this issue 1 month ago

amatiytsiv commented 1 month ago


Dify version

0.6.14

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

1. On a clean Alpine Linux VM, do a clean install of Dify 0.6.14 using Docker Compose.
2. Configure an OpenAI API key.
3. Hit "Show Models".

✔️ Expected Behavior

All models are displayed

❌ Actual Behavior

All models except gpt-4o-mini are displayed

amatiytsiv commented 1 month ago
(screenshot attached)
dosubot[bot] commented 1 month ago

Hello, @amatiytsiv! I'm here to help you with any bugs, questions, or contributions you might have. I'm Dosu, a bot designed to assist while you wait for a human maintainer.

The issue you're encountering with the "gpt-4o-mini" model not being displayed is likely due to the model name being incorrect or not supported in the current version of Dify. Ensure you are using one of the valid model names supported by Dify. Here is a list of valid model names:

- openai/gpt-4o
- openai/gpt-4
- openai/gpt-4-32k
- openai/gpt-3.5-turbo
- anthropic/claude-3.5-sonnet
- anthropic/claude-3-haiku
- anthropic/claude-3-opus
- anthropic/claude-3-sonnet
- google/gemini-pro-1.5
- google/gemini-flash-1.5
- google/gemini-pro
- cohere/command-r-plus
- cohere/command-r
- meta-llama/llama-3-70b-instruct
- meta-llama/llama-3-8b-instruct
- mistralai/mixtral-8x22b-instruct
- mistralai/mixtral-8x7b-instruct
- mistralai/mistral-7b-instruct
- qwen/qwen-2-72b-instruct
- deepseek/deepseek-chat
- deepseek/deepseek-coder

Make sure to use one of these valid model names instead of gpt-4o-mini when configuring your model settings [1].
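
A quick way to confirm that the model itself is exposed to the API key, independently of what Dify shows, is to query OpenAI's models endpoint directly. A minimal sketch using the official `openai` Python SDK (v1.x), assuming `OPENAI_API_KEY` is set in the environment:

```python
# pip install "openai>=1.0"; the key is read from the OPENAI_API_KEY env var.
from openai import OpenAI

client = OpenAI()

# List every model this key can access and check for gpt-4o-mini.
model_ids = sorted(m.id for m in client.models.list())
print("\n".join(model_ids))
print("gpt-4o-mini available to this key:", "gpt-4o-mini" in model_ids)
```

If gpt-4o-mini shows up here but not in Dify, the gap is in Dify's predefined model list rather than on the API side.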

Additionally, support for the "gpt-4o-mini" model has already been added and will be available in an upcoming release. You can refer to pull request #6442 for more details [2].

To continue talking to Dosu, mention @dosu.

amatiytsiv commented 1 month ago

@dosu On a clean install, based on previous commits, gpt-4o-mini is expected to appear in the list as soon as the API key is added; no custom model configuration should be required. So this is a bug on a clean setup.

dosubot[bot] commented 1 month ago

The "gpt-4o-mini" model is not listed among the available models after a clean install of Dify version 0.6.14 because it is not mentioned as one of the supported models in the provided documentation. The supported models include OpenAI, Azure OpenAI, Anthropic, and open-source LLMs hosted on platforms like Hugging Face, Replicate, and xinference, but there is no mention of a "gpt-4o-mini" model [1][2].

If the "gpt-4o-mini" model was expected to be included based on previous commits, it might be a regression or an oversight in the current release. You can refer to the pull request #6442 for more details on the support for this model.

To continue talking to Dosu, mention @dosu.

crazywoola commented 1 month ago

It's already added in this PR: https://github.com/langgenius/dify/pull/6442. It's available in the Cloud version, and it will be released in v0.6.15.
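
For self-hosted users waiting on v0.6.15, one way to watch for the release is GitHub's public releases API. The sketch below uses only the Python standard library and assumes the release tag follows the plain `x.y.z` format:

```python
import json
import urllib.request

# GitHub's public REST endpoint for the latest langgenius/dify release.
URL = "https://api.github.com/repos/langgenius/dify/releases/latest"

def version_tuple(tag: str) -> tuple[int, ...]:
    """Turn a tag like '0.6.15' (or 'v0.6.15') into a comparable tuple."""
    return tuple(int(part) for part in tag.lstrip("v").split("."))

with urllib.request.urlopen(URL) as resp:
    tag = json.load(resp)["tag_name"]

print("Latest Dify release:", tag)
print("Includes gpt-4o-mini (0.6.15 or later):", version_tuple(tag) >= (0, 6, 15))
```

Once the latest tag reaches 0.6.15, pulling the updated images for the Docker Compose deployment should make gpt-4o-mini appear in the predefined OpenAI model list.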