langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

Adding a new model succeeds, but the added model can't be listed or selected. #11102

Closed AlwaysBluer closed 3 days ago

AlwaysBluer commented 3 days ago

Self Checks

Dify version

0.11.1

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

I'm integrating a new model provider and have completed most of the work. But when I start Docker and test, I found that it validates the credentials successfully but doesn't store the model information. The module is a custom model integration.

(Please ignore the Xinference screenshot; the underlying code is for another model provider.) [screenshots]

This is the console output: [screenshot]

These are the files I added: [screenshot]

The lindormai.yaml is written following xinference.yaml: [screenshots]

I'm sure the validation succeeds, since I checked it in the backend. So what might be the reason for my failure? I would appreciate it if you could give me a hand.

✔️ Expected Behavior

The rerank models should be listed after I add the model.

❌ Actual Behavior

No models are listed. [screenshot]

dosubot[bot] commented 3 days ago

I found a similar unsolved discussion titled "add self deploy llm model, add error" which might be related to your issue [1]. Additionally, here are some potential reasons and solutions for your issue:

  1. Model Configuration: Ensure that the model configuration files are correctly set up according to the AI Model Entity schema. Any discrepancies in the configuration might prevent the model from being listed.

  2. Provider YAML File: Verify that the provider YAML file is correctly defined, including the provider's identifier, display name, supported model types, and credential rules. This file is crucial for the integration of new models [2].

  3. Frontend Retrieval: Check if the frontend is correctly retrieving the list of available models. The frontend should access the backend's configuration and credential information to display the models. Ensure that the backend is providing the necessary data for the frontend to list the models [3].

  4. Backend Validation: Since the credentials are validated successfully, ensure that the backend is also correctly storing the model information after validation. There might be an issue in the database operations or the logic that handles the storage of model information.

  5. Logs and Debugging: Review the backend logs for any errors or warnings that might indicate why the model information is not being stored. This can provide insights into any issues during the storage process.

By checking these areas, you might be able to identify the cause of the issue and resolve it.
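As a minimal illustration of point 4 (all names below are hypothetical stand-ins, not Dify's actual code): credential validation and model listing are separate code paths, so validation can succeed while the listing lookup still comes back empty.

```python
# Hypothetical sketch only -- none of these names are Dify's real API.
# It illustrates why credentials can validate successfully while no
# model ever shows up in the UI's model list.

class CustomProvider:
    def __init__(self):
        self._schemas = {}  # model name -> schema dict

    def validate_credentials(self, model: str, credentials: dict) -> bool:
        # Path 1: only checks that the endpoint/key are usable.
        return bool(credentials.get("server_url"))

    def get_model_schema(self, model: str, credentials: dict):
        # Path 2: the UI lists models from this lookup. If no schema was
        # ever registered (or the lookup hook is unimplemented and
        # returns None), the list stays empty despite a passing validation.
        return self._schemas.get(model)

provider = CustomProvider()
creds = {"server_url": "http://localhost:8000"}
assert provider.validate_credentials("my-rerank", creds) is True
assert provider.get_model_schema("my-rerank", creds) is None  # nothing to list
```

The takeaway is that a green checkmark on credential validation proves only the first path works; the second path has to produce a schema independently.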

To continue talking to Dosu, mention @dosu.

AlwaysBluer commented 3 days ago

@dosu, do you know what is the entrypoint to persist the model information in dify?

dosubot[bot] commented 3 days ago

The entry point for persisting model information in Dify is primarily handled in the api/core/model_manager.py file. This file contains classes such as ModelInstance, ModelManager, and LBModelManager, which are responsible for managing model instances, provider configurations, and load balancing configurations. These classes handle the storage and retrieval of model information, including credentials and load balancing configurations [1].
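A rough sketch of the registry pattern such a manager implements (simplified, hypothetical names; see `api/core/model_manager.py` for the real code):

```python
# Simplified, hypothetical sketch in the spirit of a ModelManager-style
# registry; this is not Dify's actual implementation.

class ModelManager:
    def __init__(self):
        # (provider, model_type, model) -> stored configuration
        self._store = {}

    def save_model(self, provider: str, model_type: str,
                   model: str, credentials: dict) -> None:
        # Persist the configuration keyed by provider/type/name.
        self._store[(provider, model_type, model)] = credentials

    def get_model_instance(self, provider: str, model_type: str,
                           model: str) -> dict:
        # Retrieval fails loudly if the model was never stored.
        key = (provider, model_type, model)
        if key not in self._store:
            raise LookupError(f"model {model!r} not configured for {provider!r}")
        return self._store[key]

mgr = ModelManager()
mgr.save_model("lindormai", "rerank", "my-rerank",
               {"server_url": "http://localhost:8000"})
instance = mgr.get_model_instance("lindormai", "rerank", "my-rerank")
assert instance["server_url"] == "http://localhost:8000"
```

If the save step never runs (or raises), every later `get_model_instance` lookup fails, which matches the "validated but not stored" symptom in this issue.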

To continue talking to Dosu, mention @dosu.

crazywoola commented 3 days ago

Please refresh the page to see if this persists.

AlwaysBluer commented 3 days ago

I found the problem: it was the missing implementation of the function `get_customizable_model_schema`.
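For reference, a sketch of what such an implementation can look like. The entity classes below are inline stand-ins so the example is self-contained; in Dify the method lives on the provider's model class under `api/core/model_runtime` and returns an `AIModelEntity` built from the model name and credentials.

```python
from dataclasses import dataclass, field

# Stand-in entity types so this sketch runs on its own; in Dify these
# come from api/core/model_runtime/entities (e.g. AIModelEntity).
@dataclass
class I18nObject:
    en_US: str

@dataclass
class AIModelEntity:
    model: str
    label: I18nObject
    model_type: str
    fetch_from: str = "customizable-model"
    model_properties: dict = field(default_factory=dict)

class LindormAIRerankModel:
    """Hypothetical custom rerank model class for a 'lindormai' provider."""

    def get_customizable_model_schema(self, model: str,
                                      credentials: dict) -> AIModelEntity:
        # Without this hook the frontend has no schema to render, so the
        # model never appears in the list even though credential
        # validation succeeded.
        return AIModelEntity(
            model=model,
            label=I18nObject(en_US=model),
            model_type="rerank",
        )

schema = LindormAIRerankModel().get_customizable_model_schema(
    "my-rerank", {"server_url": "http://localhost:8000"}
)
assert schema.model == "my-rerank"
assert schema.model_type == "rerank"
```

The exact fields of the returned entity depend on the model type being integrated; check an existing provider (e.g. the Xinference implementation this issue's YAML was based on) for the authoritative shape.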