Closed · xielong closed this issue 3 months ago
@xielong Hello there! I'm here to help you with any bugs, questions, or contributions you may have. I'm a bot designed to assist you while you wait for a human maintainer. Let's tackle this issue together!
To resolve the "Bad Request Error, Invalid model: ernie-4.0-8k-Latest" issue when using the "Ernie-4.0-8K-Latest" model in a self-hosted (Docker) deployment of Dify 0.6.14, make sure the model configuration in your YAML file is defined as shown below:
```yaml
model: ernie-4.0-8k
label:
  en_US: Ernie-4.0-8K
model_type: llm
features:
  - agent-thought
model_properties:
  mode: chat
  context_size: 8192
parameter_rules:
  - name: temperature
    use_template: temperature
    min: 0.1
    max: 1.0
    default: 0.8
  - name: top_p
    use_template: top_p
  - name: max_tokens
    use_template: max_tokens
    default: 1024
    min: 2
    max: 2048
  - name: presence_penalty
    use_template: presence_penalty
    default: 1.0
    min: 1.0
    max: 2.0
  - name: frequency_penalty
    use_template: frequency_penalty
  - name: response_format
    use_template: response_format
  - name: disable_search
    label:
      zh_Hans: 禁用搜索
      en_US: Disable Search
    type: boolean
    help:
      zh_Hans: 禁用模型自行进行外部搜索。
      en_US: Disable the model to perform external search.
    required: false
```
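The `parameter_rules` entries above bound each sampling parameter. As a rough illustration only (a hypothetical helper, not Dify's actual validation code), such rules might be enforced like this:

```python
# Hypothetical sketch: clamp-free validation of sampling parameters
# against min/max/default bounds, mirroring the parameter_rules above.
RULES = {
    "temperature": {"min": 0.1, "max": 1.0, "default": 0.8},
    "max_tokens": {"min": 2, "max": 2048, "default": 1024},
}

def apply_rule(name: str, value=None):
    """Return the default when no value is given; reject out-of-range values."""
    rule = RULES[name]
    if value is None:
        return rule["default"]
    if not rule["min"] <= value <= rule["max"]:
        raise ValueError(f"{name}={value} outside [{rule['min']}, {rule['max']}]")
    return value

print(apply_rule("temperature"))       # 0.8 (default applied)
print(apply_rule("max_tokens", 512))   # 512 (within bounds)
```

A value such as `max_tokens=4096` would be rejected here, since it exceeds the configured `max: 2048`.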
Make sure that the model name `ernie-4.0-8k` is correctly referenced in your application and that there are no typos or case mismatches in the model name [1][2].
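The error message suggests a case mismatch: `ernie-4.0-8k-Latest` (capital "L") does not exactly match a lowercase configured model name. A minimal sketch of why an exact, case-sensitive lookup fails here (the model set and `resolve` helper are hypothetical, not Dify's actual resolution code):

```python
# Hypothetical set of model names as they appear in the provider's YAML files.
CONFIGURED_MODELS = {"ernie-4.0-8k", "ernie-4.0-8k-latest"}

def resolve(model_name: str) -> str:
    """Resolve a requested model name against the configured entries."""
    # An exact lookup rejects "ernie-4.0-8k-Latest" because of the capital "L".
    if model_name in CONFIGURED_MODELS:
        return model_name
    # Lowercasing recovers the intended entry, if one exists.
    normalized = model_name.lower()
    if normalized in CONFIGURED_MODELS:
        return normalized
    raise ValueError(f"Invalid model: {model_name}")

print(resolve("ernie-4.0-8k-Latest"))  # ernie-4.0-8k-latest
```

In practice this means the model name sent to the API must match the YAML `model:` field exactly, including case.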
To continue talking to Dosu, mention @dosu.
**Dify version**

0.6.14

**Cloud or Self Hosted**

Self Hosted (Docker)
**Steps to reproduce**

1. Choose the "Ernie-4.0-8K-Latest" model.
2. Send any question to the model; an error is encountered.
**✔️ Expected Behavior**

There is a normal response.

**❌ Actual Behavior**

[wenxin] Bad Request Error, Invalid model: ernie-4.0-8k-Latest