Closed — DavdGao closed this 1 month ago
LGTM, but I am not sure if all LLM services support setting system prompts (i.e., providing {"role": "system", "content": "xxx"} ).
The model APIs involved in this PR are Ollama, DashScope, LiteLLM, Yi, and Zhipu. We have confirmed that Ollama, DashScope, Yi, and Zhipu support system prompts. LiteLLM handles the system prompt inside its own library: if the underlying backend does not support it, LiteLLM converts the system message into a user message.
Description
I found that the current format strategy leads to a misunderstanding.
For example, with the following formatted message the API provider will automatically insert its own system prompt, and the LLM will refuse to act as the new role, e.g. "Friday".
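The original formatted message is not reproduced here; a hypothetical example of the problem (all content illustrative) is a prompt where the role instruction has been merged into a user turn, leaving no system message:

```python
# Hypothetical formatted prompt illustrating the issue: because there is no
# {"role": "system", ...} entry, the API provider is free to prepend its own
# default system prompt (e.g. "You are Qwen ..."), which overrides the
# role instruction buried inside the user content.
messages = [
    {
        "role": "user",
        "content": (
            "You are a helpful assistant named Friday.\n\n"
            "## Conversation History\n"
            "user: What's your name?"
        ),
    },
]

# No system message is present in the formatted prompt.
has_system = any(m["role"] == "system" for m in messages)
print(has_system)
```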
With the above prompt, Qwen-max responds "I'm Qwen, not Friday."
Solution
We check whether a system prompt is present and, if so, place it at the beginning of the formatted prompt.
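The fix described above can be sketched as follows. This is a minimal illustration, not AgentScope's actual implementation; the function and field names are assumptions:

```python
from typing import Dict, List


def format_messages(msgs: List[Dict]) -> List[Dict]:
    """Sketch of the fix: keep the system prompt as a standalone leading
    system message instead of folding it into the user content.

    All names here are illustrative, not the library's real API.
    """
    system_msgs = [m for m in msgs if m["role"] == "system"]
    dialog = [m for m in msgs if m["role"] != "system"]

    # Merge the remaining turns into a single user message, as the
    # original format strategy does.
    history = "\n".join(f"{m['role']}: {m['content']}" for m in dialog)
    formatted = [{"role": "user", "content": "## Conversation History\n" + history}]

    if system_msgs:
        # Put the system prompt at the very beginning of the formatted
        # prompt so the provider does not insert its own default one.
        formatted = [system_msgs[0]] + formatted
    return formatted


result = format_messages(
    [
        {"role": "system", "content": "You are a helpful assistant named Friday."},
        {"role": "user", "content": "What's your name?"},
    ]
)
```

With this layout the provider sees an explicit `{"role": "system", ...}` entry first, so it has no reason to prepend its own default system prompt.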
Checklist
Please check the following items before code is ready to be reviewed.