langgenius/dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

Chatflow: Cannot save LLM module prompt with Completion model (Phi3/dockerv0.6.8) #4563

Open jafrank88 opened 1 month ago

jafrank88 commented 1 month ago


Dify version

0.6.8

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Create a chatbot with Knowledge Retrieval and use the phi3 [completion] model for the LLM. Even after typing in the Prompt field, the character count stays at 0, and the prompt is not saved when the module is closed and reopened. (Screenshot: LLM completion prompt)

✔️ Expected Behavior

I expected the Prompt field to show the character count and the prompt to be saved when the module is closed.

❌ Actual Behavior

I believe the prompt is not saved and the chatflow fails because of that.

dosubot[bot] commented 1 month ago

To resolve the issue where the Prompt field shows 0 characters and does not save when using the Phi3 [completion] model in Dify 0.6.8, self-hosted on Docker, you need to ensure that the completionPromptConfig is correctly set and updated. Here are the steps to troubleshoot and fix the issue:

  1. Ensure completionPromptConfig is updated correctly: it must receive the new prompt text whenever the user edits the field. Verify that the setCurrentAdvancedPrompt function sets the prompt field for the completion branch (a usage sketch follows the snippet):

    // Routes the edited prompt into the config that matches the model mode:
    // chat models store an array of PromptItem, completion models a single item.
    const setCurrentAdvancedPrompt = (prompt: PromptItem | PromptItem[], isUserChanged?: boolean) => {
        if (!isAdvancedPrompt)
            return;
        if (modelModeType === ModelModeType.chat) {
            setChatPromptConfig({
                ...chatPromptConfig,
                prompt: prompt as PromptItem[],
            });
        } else {
            setCompletionPromptConfig({
                ...completionPromptConfig,
                prompt: prompt as PromptItem,
            });
        }
        if (isUserChanged)
            onUserChangedPrompt();
    };
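As a sanity check, the prompt editor's change handler must call this setter with a single PromptItem when the model is in completion mode. Here is a minimal sketch of such a call site, assuming a plain text onChange callback (the handler name is hypothetical, not Dify's actual component API):

    // Hypothetical wiring for the completion prompt editor's onChange.
    // `PromptItem` and `setCurrentAdvancedPrompt` come from the hook above;
    // the handler itself is illustrative.
    const handleCompletionPromptChange = (text: string) => {
        const promptItem = { text } as PromptItem;
        // Pass isUserChanged = true so onUserChangedPrompt() marks the config as dirty.
        setCurrentAdvancedPrompt(promptItem, true);
    };

Note that if the editor instead passes an array (the chat-mode shape) while modelModeType is completion, the cast `prompt as PromptItem` silently produces a config whose `prompt.text` is undefined, which would match the 0-character symptom.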
  2. Check the migrateToDefaultPrompt function: ensure it correctly replaces the prompt text. This function migrates and sets the default prompt configuration when the model mode changes (a call-site sketch follows the snippet):

    const migrateToDefaultPrompt = async (isMigrateToCompetition?: boolean, toModelModeType?: ModelModeType) => {
        const mode = modelModeType;
        const toReplacePrePrompt = prePrompt || '';
        // Simple (non-advanced) prompt mode: fetch the default template and
        // substitute the current pre-prompt for the placeholder.
        if (!isAdvancedPrompt) {
            const { chat_prompt_config, completion_prompt_config, stop } = await fetchPromptTemplate({
                appMode,
                mode,
                modelName,
                hasSetDataSet,
            });
            if (modelModeType === ModelModeType.chat) {
                const newPromptConfig = produce(chat_prompt_config, (draft) => {
                    draft.prompt = draft.prompt.map((p) => {
                        return {
                            ...p,
                            text: p.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt),
                        };
                    });
                });
                setChatPromptConfig(newPromptConfig);
            } else {
                const newPromptConfig = produce(completion_prompt_config, (draft) => {
                    draft.prompt.text = draft.prompt.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt);
                });
                setCompletionPromptConfig(newPromptConfig);
                setCompletionParams({
                    ...completionParams,
                    stop,
                });
            }
            return;
        }

        // Advanced prompt mode: migrate between chat and completion shapes.
        // The parameter name preserves the upstream spelling
        // "isMigrateToCompetition" (i.e. "migrate to completion").
        if (isMigrateToCompetition) {
            const { completion_prompt_config, chat_prompt_config, stop } = await fetchPromptTemplate({
                appMode,
                mode: toModelModeType as ModelModeType,
                modelName,
                hasSetDataSet,
            });

            if (toModelModeType === ModelModeType.completion) {
                const newPromptConfig = produce(completion_prompt_config, (draft) => {
                    if (!completionPromptConfig.prompt?.text)
                        draft.prompt.text = draft.prompt.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt);
                    else
                        draft.prompt.text = completionPromptConfig.prompt?.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt);
                    if (['advanced-chat', 'agent-chat', 'chat'].includes(appMode) && completionPromptConfig.conversation_histories_role.assistant_prefix && completionPromptConfig.conversation_histories_role.user_prefix)
                        draft.conversation_histories_role = completionPromptConfig.conversation_histories_role;
                });
                setCompletionPromptConfig(newPromptConfig);
                if (!completionParams.stop || completionParams.stop.length === 0) {
                    setCompletionParams({
                        ...completionParams,
                        stop,
                    });
                }
                setStop(stop); // switching the mode's params is async; it may override the stop value
            } else {
                const newPromptConfig = produce(chat_prompt_config, (draft) => {
                    draft.prompt = draft.prompt.map((p) => {
                        return {
                            ...p,
                            text: p.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt),
                        };
                    });
                });
                setChatPromptConfig(newPromptConfig);
            }
        }
    };
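This migration typically runs when the selected model's mode changes. Here is a hedged sketch of a call site, assuming a wrapper around the model picker's change event (the wrapper is illustrative, not Dify code; only migrateToDefaultPrompt comes from the hook above):

    // Illustrative call site: re-shape the advanced prompt config whenever
    // the model mode flips between chat and completion.
    const handleModelModeChange = async (newModeType: ModelModeType) => {
        if (isAdvancedPrompt && newModeType !== modelModeType)
            await migrateToDefaultPrompt(true, newModeType); // note the upstream spelling "isMigrateToCompetition"
    };

This is also consistent with the workaround reported below: switching to a chat model and back forces this migration path, which re-populates completionPromptConfig.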
  3. Check for error handling mechanisms: ensure no validation error is blocking the save. The Dify codebase validates keys, rejects duplicate keys, and validates prompt content; make sure one of these checks is not rejecting the prompt (a sketch of how the empty-prompt flag might be derived follows the snippet).

    // Validation guard: an empty prompt aborts the save with an error toast.
    if (promptEmpty) {
        notify({ type: 'error', message: t('appDebug.otherError.promptNoBeEmpty'), duration: 3000 });
        return;
    }
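For reference, here is one plausible way such a promptEmpty flag could be derived (an assumption about the check, not Dify's exact implementation), distinguishing the two config shapes:

    // Assumed derivation of promptEmpty: completion mode holds one PromptItem,
    // chat mode holds an array of them.
    const promptEmpty = modelModeType === ModelModeType.completion
        ? !completionPromptConfig.prompt?.text?.trim()
        : chatPromptConfig.prompt.every(p => !p.text?.trim());

If the completion branch reads a field that was never written (see step 1), the prompt would always be treated as empty and the save would be rejected.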

By following these steps and ensuring that completionPromptConfig is set and updated correctly, you should be able to resolve the issue where the Prompt field shows 0 characters and does not save when using the Phi3 [completion] model in Dify 0.6.8, self-hosted on Docker.
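One plausible root cause worth checking (an assumption, not a confirmed diagnosis): if the change handler closes over a stale completionPromptConfig, the object spread in step 1 re-saves the old, empty prompt. Assuming setCompletionPromptConfig is a React state setter, a functional update avoids that class of bug:

    // Defensive rewrite of the completion branch from step 1. This assumes
    // setCompletionPromptConfig accepts a functional updater (true for React's
    // useState; if it is a context dispatcher, this form may not apply).
    setCompletionPromptConfig(prev => ({
        ...prev,
        prompt: prompt as PromptItem,
    }));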

#### Details

**Sources**
- [web/app/components/app/configuration/hooks/use-advanced-prompt-config.ts](https://github.com/langgenius/dify/blob/main/web/app/components/app/configuration/hooks/use-advanced-prompt-config.ts)
- [web/types/app.ts](https://github.com/langgenius/dify/blob/main/web/types/app.ts)
- [web/models/debug.ts](https://github.com/langgenius/dify/blob/main/web/models/debug.ts)
- [web/app/components/app/configuration/index.tsx](https://github.com/langgenius/dify/blob/main/web/app/components/app/configuration/index.tsx)
- [api/services/advanced_prompt_template_service.py](https://github.com/langgenius/dify/blob/main/api/services/advanced_prompt_template_service.py)
- [en/features/prompt-engineering/prompt-template.md](https://github.com/langgenius/dify-docs/blob/main/en/features/prompt-engineering/prompt-template.md)
- [en/features/prompt-engineering/README.md](https://github.com/langgenius/dify-docs/blob/main/en/features/prompt-engineering/README.md)
- [web/app/components/base/prompt-editor/constants.tsx](https://github.com/langgenius/dify/blob/main/web/app/components/base/prompt-editor/constants.tsx)
- [api/core/prompt/utils/prompt_template_parser.py](https://github.com/langgenius/dify/blob/main/api/core/prompt/utils/prompt_template_parser.py)
- [web/app/components/app/configuration/config-var/index.tsx](https://github.com/langgenius/dify/blob/main/web/app/components/app/configuration/config-var/index.tsx)
- [web/app/components/app/configuration/config-prompt/simple-prompt-input.tsx](https://github.com/langgenius/dify/blob/main/web/app/components/app/configuration/config-prompt/simple-prompt-input.tsx)
- [web/app/components/app/configuration/config/agent/prompt-editor.tsx](https://github.com/langgenius/dify/blob/main/web/app/components/app/configuration/config/agent/prompt-editor.tsx)
- [docker/docker-compose.middleware.yaml](https://github.com/langgenius/dify/blob/main/docker/docker-compose.middleware.yaml)
- [api/config.py](https://github.com/langgenius/dify/blob/main/api/config.py)
- [docker/docker-compose.yaml](https://github.com/langgenius/dify/blob/main/docker/docker-compose.yaml)
**About Dosu**

This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

To continue the conversation, mention @dosu.

jafrank88 commented 1 month ago

I was able to work around the problem by switching the model to a Chat model and then back to a Completion model, but I am not sure that is the ideal solution.