langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

Settings/Model Provider/Ollama Model status is always green even if the model is removed on server #5648

Open Ewen2015 opened 2 days ago

Ewen2015 commented 2 days ago

Self Checks

Dify version

0.6.11

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

  1. On the model host server:

ollama rm llama

  2. In Dify:

Settings > Model Provider > Models > Ollama

Model status is always green even if the model is removed from the server.
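The server-side removal can be confirmed independently of Dify by querying Ollama's model list endpoint (GET /api/tags). A minimal sketch, assuming the default Ollama port 11434 and the model tag `llama` used above:

```python
import requests

OLLAMA_BASE_URL = "http://localhost:11434"  # assumed default Ollama endpoint
MODEL_NAME = "llama"                        # the model removed with `ollama rm llama`

def model_exists(name: str) -> bool:
    """Return True if the Ollama server still lists the model locally."""
    resp = requests.get(f"{OLLAMA_BASE_URL}/api/tags", timeout=5)
    resp.raise_for_status()
    models = resp.json().get("models", [])
    # Each entry carries a "name" field such as "llama3:latest"
    return any(m.get("name", "").split(":")[0] == name for m in models)

if __name__ == "__main__":
    print("still present" if model_exists(MODEL_NAME) else "removed")
```

If this reports "removed" while Dify still shows a green status, the stale indicator is coming from Dify's side rather than from the Ollama server.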

✔️ Expected Behavior

Model status turns red when a model is removed.

❌ Actual Behavior

Model status is always green even if the model is removed from the server.

dosubot[bot] commented 2 days ago

Hello, @Ewen2015! I'm here to assist you with any bugs, questions, or contributions you may have. Let's work together to tackle the issue you've encountered.

The model status remaining green after the model is removed from the server is likely a caching issue. The LBModelManager class uses Redis to manage load balancing configurations and their cooldown states, so the status can stay green if that cache is not invalidated when the model is removed [1].
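As an illustration of how such a cached status can mask a removed model, here is a minimal Redis sketch; the key names and TTL below are assumptions for illustration, not the actual LBModelManager layout:

```python
import redis

r = redis.Redis(host="localhost", port=6379, db=0)

# Hypothetical key layout; Dify's LBModelManager uses its own naming scheme.
COOLDOWN_KEY = "lb:cooldown:ollama:llama"
STATUS_KEY = "provider:status:ollama:llama"

def cached_status() -> str:
    """Return the cached status without re-checking the upstream server."""
    status = r.get(STATUS_KEY)
    # If nothing deletes STATUS_KEY when the model is removed on the Ollama
    # host, the stale "active" value keeps the UI indicator green.
    return status.decode() if status else "unknown"

def mark_cooldown(seconds: int = 60) -> None:
    """Put the model into cooldown for a fixed window (key expires automatically)."""
    r.setex(COOLDOWN_KEY, seconds, "1")

def in_cooldown() -> bool:
    return r.exists(COOLDOWN_KEY) == 1
```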

Additionally, the ModelProviderService class handles model management, including enabling and disabling models. When a model is removed, the remove_model_credentials method is called and should invalidate the cache; if it does not, the stale entry keeps the status green [2].

To address this issue, you may need to ensure that the cache is properly invalidated when a model is removed. This can be done by using the delete method in the ProviderCredentialsCache class to delete cached model provider credentials [3].
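A minimal, self-contained sketch of the suggested invalidation step; the ProviderCredentialsCache stand-in below only mirrors the idea, and its constructor arguments and key naming are assumptions rather than Dify's actual API:

```python
import redis

r = redis.Redis(host="localhost", port=6379, db=0)

class ProviderCredentialsCache:
    """Stand-in for Dify's cache helper; the real constructor arguments differ."""

    def __init__(self, tenant_id: str, provider: str):
        self.cache_key = f"provider_credentials:{tenant_id}:{provider}"

    def delete(self) -> None:
        # Drop the cached entry so the next status check hits the provider again.
        r.delete(self.cache_key)

def remove_model_credentials(tenant_id: str, provider: str, model: str) -> None:
    """Remove a model's credentials and invalidate the cached provider state."""
    # 1. Delete the stored credentials / model record (persistence layer, omitted).
    # 2. Invalidate the cache so the green/red status is recomputed on next load.
    ProviderCredentialsCache(tenant_id, provider).delete()
```

The key point is the second step: whatever removes or disables a model must also delete the cached provider state, otherwise the status indicator keeps reflecting the last cached value.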

To continue talking to Dosu, mention @dosu.