Open Howe829 opened 1 month ago
The issue you're encountering, where the suggested-question API returns an empty list and a ValueError states "Model Parameter temperature should be greater than or equal to 0.1", is likely due to the temperature parameter being set below the minimum allowed value of 0.1 in the model configuration.

In Dify, the min temperature parameter for the Wenxin Ernie series models is set to 0.1, which means any temperature value provided must be at least 0.1 [1]. The error suggests that the temperature value in use is below this threshold, causing validation to fail in the _validate_and_filter_model_parameters function [2].

To resolve this, ensure that the temperature parameter in your model configuration is set to 0.1 or higher. Check the configuration where the model parameters are defined and adjust the temperature value accordingly. This should allow the suggested-question API to function correctly and return the expected results.
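The fix above can be sketched as a small clamp step applied before invoking the model. This is an illustrative sketch, not Dify's actual API: the function name `clamp_to_rules` and the dict-based rule shape are assumptions standing in for the provider's `parameter_rules`.

```python
# Hypothetical sketch: raise each parameter to its rule's declared minimum
# before invoking the model, so validation like
# _validate_and_filter_model_parameters no longer rejects the call.
# The rule format here ({"min": ..., "max": ...}) is illustrative.

def clamp_to_rules(params: dict, rules: dict) -> dict:
    """Return a copy of params with values raised to each rule's minimum."""
    clamped = dict(params)
    for name, rule in rules.items():
        min_value = rule.get("min")
        if min_value is not None and name in clamped and clamped[name] < min_value:
            clamped[name] = min_value
    return clamped

# Wenxin Ernie declares min temperature 0.1, so a temperature of 0 is raised.
rules = {"temperature": {"min": 0.1, "max": 1.0}}
print(clamp_to_rules({"temperature": 0, "max_tokens": 256}, rules))
```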
Dify version
0.9.2
Cloud or Self Hosted
Self Hosted (Source)
Steps to reproduce
Use a model from the Wenxin Ernie series as the default model, with the min temperature parameter set to 0.1 in its parameter_rules. The following error occurs when calling the suggested-question API:
This issue is caused by the code at line 96 of llm_generator.py: https://github.com/langgenius/dify/blob/7d3dad3d1d2c64379c24184faa22e2ce1ad26490/api/core/llm_generator/llm_generator.py#L96
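A minimal reproduction of the reported failure, assuming the caller passes a temperature below the provider's declared minimum (the function name `validate_temperature` is illustrative, not Dify's actual helper; only the error message is taken from the report):

```python
# Hypothetical reproduction: a parameter below the provider's declared
# minimum (0.1 for Wenxin Ernie) fails validation with the reported message.

def validate_temperature(value: float, minimum: float = 0.1) -> float:
    """Reject temperatures below the declared minimum (illustrative)."""
    if value < minimum:
        raise ValueError(
            f"Model Parameter temperature should be greater than or equal to {minimum}."
        )
    return value

try:
    validate_temperature(0)  # a hardcoded 0 is below Ernie's minimum of 0.1
except ValueError as e:
    print(e)
```

Because the suggested-question generator swallows this exception upstream, the API surfaces the failure only as an empty list rather than an error response.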
✔️ Expected Behavior
The API returns a list of suggested questions.
❌ Actual Behavior
An empty list is returned.