jafrank88 opened 1 month ago
To resolve the issue where the Prompt field does not save and shows 0 characters when using the Phi3 [completion] model in Dify version 0.6.8, self-hosted on Docker, you need to ensure that the completionPromptConfig is correctly set and updated. Here are the steps to troubleshoot and fix the issue:

Ensure completionPromptConfig is updated correctly:
The completionPromptConfig must be updated with the new prompt text. Verify that the setCurrentAdvancedPrompt function sets the prompt field correctly:
```typescript
const setCurrentAdvancedPrompt = (prompt: PromptItem | PromptItem[], isUserChanged?: boolean) => {
  if (!isAdvancedPrompt)
    return;

  if (modelModeType === ModelModeType.chat) {
    setChatPromptConfig({
      ...chatPromptConfig,
      prompt: prompt as PromptItem[],
    });
  }
  else {
    setCompletionPromptConfig({
      ...completionPromptConfig,
      prompt: prompt as PromptItem,
    });
  }
  if (isUserChanged)
    onUserChangedPrompt();
};
```
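The setter above relies on React's immutable-update convention: a state setter must receive a new object, otherwise React may not detect the change and the UI (including the character counter) will not refresh. A minimal sketch of that pattern, using hypothetical simplified types in place of Dify's:

```typescript
// Simplified stand-ins for Dify's types (illustrative, not the real definitions).
interface PromptItem { role?: string; text: string }
interface CompletionPromptConfig { prompt: PromptItem }

const current: CompletionPromptConfig = { prompt: { text: '' } };

// Correct update: spread the old config into a brand-new object so the
// reference changes and React schedules a re-render.
const updated: CompletionPromptConfig = {
  ...current,
  prompt: { text: 'You are a helpful assistant.' },
};

console.log(updated !== current); // true: a new reference was created
console.log(updated.prompt.text.length);
```

If the config were mutated in place instead of spread into a new object, the Prompt field could keep showing 0 characters even though text was typed, because the component holding the old reference never re-renders.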
Check the migrateToDefaultPrompt function:
Ensure that the migrateToDefaultPrompt function correctly replaces the prompt text. This function is responsible for migrating to and setting the default prompt configuration:
```typescript
const migrateToDefaultPrompt = async (isMigrateToCompetition?: boolean, toModelModeType?: ModelModeType) => {
  const mode = modelModeType;
  const toReplacePrePrompt = prePrompt || '';

  if (!isAdvancedPrompt) {
    const { chat_prompt_config, completion_prompt_config, stop } = await fetchPromptTemplate({
      appMode,
      mode,
      modelName,
      hasSetDataSet,
    });
    if (modelModeType === ModelModeType.chat) {
      const newPromptConfig = produce(chat_prompt_config, (draft) => {
        draft.prompt = draft.prompt.map((p) => {
          return {
            ...p,
            text: p.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt),
          };
        });
      });
      setChatPromptConfig(newPromptConfig);
    }
    else {
      const newPromptConfig = produce(completion_prompt_config, (draft) => {
        draft.prompt.text = draft.prompt.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt);
      });
      setCompletionPromptConfig(newPromptConfig);
      setCompletionParams({
        ...completionParams,
        stop,
      });
    }
    return;
  }

  if (isMigrateToCompetition) {
    const { completion_prompt_config, chat_prompt_config, stop } = await fetchPromptTemplate({
      appMode,
      mode: toModelModeType as ModelModeType,
      modelName,
      hasSetDataSet,
    });
    if (toModelModeType === ModelModeType.completion) {
      const newPromptConfig = produce(completion_prompt_config, (draft) => {
        if (!completionPromptConfig.prompt?.text)
          draft.prompt.text = draft.prompt.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt);
        else
          draft.prompt.text = completionPromptConfig.prompt?.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt);

        if (['advanced-chat', 'agent-chat', 'chat'].includes(appMode) && completionPromptConfig.conversation_histories_role.assistant_prefix && completionPromptConfig.conversation_histories_role.user_prefix)
          draft.conversation_histories_role = completionPromptConfig.conversation_histories_role;
      });
      setCompletionPromptConfig(newPromptConfig);
      if (!completionParams.stop || completionParams.stop.length === 0) {
        setCompletionParams({
          ...completionParams,
          stop,
        });
      }
      setStop(stop); // switching the mode's params is async; it may override the stop value
    }
    else {
      const newPromptConfig = produce(chat_prompt_config, (draft) => {
        draft.prompt = draft.prompt.map((p) => {
          return {
            ...p,
            text: p.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt),
          };
        });
      });
      setChatPromptConfig(newPromptConfig);
    }
  }
};
```
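The key step in migrateToDefaultPrompt is substituting PRE_PROMPT_PLACEHOLDER_TEXT in the fetched template with the user's pre-prompt. A hedged sketch of that substitution in plain TypeScript (Dify performs it inside immer's produce; the placeholder value and the template text below are assumptions for illustration, not taken from the Dify codebase):

```typescript
// Assumed placeholder value for illustration only.
const PRE_PROMPT_PLACEHOLDER_TEXT = '{{#pre_prompt#}}';

interface CompletionPromptConfig { prompt: { text: string } }

// Plain-object equivalent of the produce() call: return a new config whose
// prompt text has the placeholder replaced by the user's pre-prompt.
function migrateCompletionPrompt(
  template: CompletionPromptConfig,
  prePrompt: string,
): CompletionPromptConfig {
  return {
    ...template,
    prompt: {
      ...template.prompt,
      text: template.prompt.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, prePrompt),
    },
  };
}

// Hypothetical template text; the real one comes from fetchPromptTemplate.
const template = { prompt: { text: `${PRE_PROMPT_PLACEHOLDER_TEXT}\n\n{{query}}` } };
const migrated = migrateCompletionPrompt(template, 'Answer using the knowledge base.');
console.log(migrated.prompt.text);
```

If the placeholder is never replaced (or the template arrives empty for a given model such as phi3 in completion mode), the migrated prompt text ends up empty, which would match the "0 characters" symptom.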
Check for error handling mechanisms:
Ensure that no validation error is preventing the prompt from being saved. The Dify codebase includes several error handling mechanisms, such as key validation, duplicate key checks, and prompt content validation. Make sure these checks are not the cause. For example, an empty prompt triggers this guard:

```typescript
if (promptEmpty) {
  notify({ type: 'error', message: t('appDebug.otherError.promptNoBeEmpty'), duration: 3000 });
  return;
}
```
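A hedged sketch of the kind of emptiness check behind a guard like that (the helper name and logic are illustrative, not Dify's exact code): if the prompt text that reaches validation is undefined or whitespace-only, validation fires and the save is aborted, which would explain a prompt that silently fails to persist.

```typescript
// Illustrative helper: treat missing or whitespace-only text as empty.
function isPromptEmpty(text: string | undefined | null): boolean {
  return !text || text.trim().length === 0;
}

console.log(isPromptEmpty(undefined)); // true
console.log(isPromptEmpty('   '));     // true
console.log(isPromptEmpty('Summarize the retrieved documents.')); // false
```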
By following these steps and ensuring that the completionPromptConfig is correctly set and updated, you should be able to resolve the issue where the Prompt field does not save and shows 0 characters when using the Phi3 [completion] model in Dify version 0.6.8.
I was able to solve the problem by switching the model to a Chat model and switching it back to a Completion model, but I am not sure that is the ideal solution.
Dify version
0.6.8
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
Create a chatbot with Knowledge Retrieval and use the phi3 [completion] model for the LLM. Even as you type in the Prompt field, it keeps showing 0 characters, and the prompt is not saved when closing and reopening that module.
✔️ Expected Behavior
I was expecting the Prompt field to show the character count and have the prompt be saved when closing the module.
❌ Actual Behavior
I believe the prompt is not saved and the chatflow fails because of that.