modelscope / agentscope

Start building LLM-empowered multi-agent applications in an easier way.
https://doc.agentscope.io/
Apache License 2.0

[Bug]:OpenAI Chat API Compatibility Issue with 'qwen-plus' Model #425

Open · NJAUzhangwenjun opened this issue 2 weeks ago

NJAUzhangwenjun commented 2 weeks ago

Title: OpenAI Chat API Compatibility Issue with 'qwen-plus' Model

Describe the bug: When attempting to use the 'qwen-plus' model via the OpenAI Chat API compatibility mode, a BadRequestError occurs indicating an invalid parameter for the messages list.

To Reproduce:

  1. Set the OpenAI API key and model configuration.
  2. Initialize the DialogAgent with the specified model configuration.
  3. Execute the script and attempt to interact with the model.
  4. Encounter the error during the model interaction phase.

Expected behavior: The 'qwen-plus' model should process the request without errors and return a valid response when used with the OpenAI Chat API compatibility mode.

Error messages:

openai.BadRequestError: Error code: 400 - {'error': {'code': 'invalid_parameter_error', 'param': None, 'message': '<400> InternalError.Algo.InvalidParameter: The messages list should not contain only one message with role "system". Please include at least one other role in the messages.', 'type': 'invalid_request_error'}, 'id': 'chatcmpl-649c5acb-05c4-9b15-8b44-46efa5a3d896'}

Code:

# -*- coding: utf-8 -*-
"""An example for conversation with the 'qwen-plus' model through the
OpenAI Chat API compatibility mode (adapted from the GPT-4o example)."""
import agentscope
from agentscope.agents import UserAgent, DialogAgent

# Fill in your DashScope API key (used via the OpenAI-compatible endpoint)
YOUR_OPENAI_API_KEY = "xxx"

model_config = {
    "config_name": "gpt-4o_config",
    "model_type": "openai_chat",
    "model_name": "qwen-plus",
    "api_key": YOUR_OPENAI_API_KEY,
    "client_args": {
        "base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1"
    },
    "generate_args": {
        "temperature": 0.7,
    },
}

agentscope.init(
    model_configs=model_config,
    project="Conversation with GPT-4o",
)

# Require user to input URL, and press enter to skip the URL input
user = UserAgent("user", require_url=True)

agent = DialogAgent(
    "Friday",
    sys_prompt="You're a helpful assistant named Friday.",
    model_config_name="gpt-4o_config",
)

x = None
while True:
    x = agent(x)
    x = user(x)
    if x.content == "exit":  # type "exit" to break the loop
        break
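
For reference, the same 400 can presumably be reproduced outside AgentScope by calling the compatible-mode endpoint directly with the openai client and a messages list that contains only a system message (the API key and prompt below are placeholders):

# Minimal reproduction sketch outside AgentScope (placeholder key/prompt).
import openai
from openai import OpenAI

client = OpenAI(
    api_key="xxx",  # placeholder DashScope API key
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
)

try:
    # A messages list containing only a "system" role is expected to trigger
    # the same invalid_parameter_error reported above.
    client.chat.completions.create(
        model="qwen-plus",
        messages=[
            {"role": "system", "content": "You're a helpful assistant named Friday."},
        ],
    )
except openai.BadRequestError as e:
    print(e)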

Environment:

NJAUzhangwenjun commented 2 weeks ago

Full error log

/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/.venv/bin/python /Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/test/an_wjhub/runs/simple_chat.py 
2024-08-28 01:27:36.418 | INFO     | agentscope.models:read_model_configs:186 - Load configs for model wrapper: gpt-4o_config
2024-08-28 01:27:36.441 | INFO     | agentscope.models.model:__init__:201 - Initialize model by configuration [gpt-4o_config]
2024-08-28 01:27:36.468 | WARNING  | agentscope.models.openai_model:__init__:83 - fail to get max_length for qwen-plus: 'Model [qwen-plus] not found in OPENAI_MAX_LENGTH. The last updated date is 20231212'
2024-08-28 01:27:36.469 | INFO     | agentscope.utils.monitor:register_budget:609 - set budget None to qwen-plus
2024-08-28 01:27:36.469 | WARNING  | agentscope.utils.monitor:register_budget:639 - Calculate budgets for model [qwen-plus] is not supported
2024-08-28 01:27:36.470 | INFO     | agentscope.utils.monitor:register:417 - Register metric [qwen-plus.call_counter] to SqliteMonitor with unit [times] and quota [None]
2024-08-28 01:27:36.473 | INFO     | agentscope.utils.monitor:register:417 - Register metric [qwen-plus.prompt_tokens] to SqliteMonitor with unit [token] and quota [None]
2024-08-28 01:27:36.475 | INFO     | agentscope.utils.monitor:register:417 - Register metric [qwen-plus.completion_tokens] to SqliteMonitor with unit [token] and quota [None]
2024-08-28 01:27:36.478 | INFO     | agentscope.utils.monitor:register:417 - Register metric [qwen-plus.total_tokens] to SqliteMonitor with unit [token] and quota [None]
Traceback (most recent call last):
  File "/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/test/an_wjhub/runs/simple_chat.py", line 39, in <module>
    x = agent(x)
        ^^^^^^^^
  File "/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/.venv/lib/python3.11/site-packages/agentscope/agents/agent.py", line 295, in __call__
    res = self.reply(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/.venv/lib/python3.11/site-packages/agentscope/agents/dialog_agent.py", line 74, in reply
    response = self.model(prompt).text
               ^^^^^^^^^^^^^^^^^^
  File "/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/.venv/lib/python3.11/site-packages/agentscope/models/model.py", line 117, in checking_wrapper
    return model_call(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/.venv/lib/python3.11/site-packages/agentscope/models/openai_model.py", line 193, in __call__
    response = self.client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/.venv/lib/python3.11/site-packages/openai/_utils/_utils.py", line 274, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/.venv/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 668, in create
    return self._post(
           ^^^^^^^^^^^
  File "/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1260, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 937, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/Users/zhangwenjun/Documents/javaFiles/agents/ai-wjhub/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1041, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'code': 'invalid_parameter_error', 'param': None, 'message': '<400> InternalError.Algo.InvalidParameter: The messages list should not contain only one message with role "system". Please include at least one other role in the messages.', 'type': 'invalid_request_error'}, 'id': 'chatcmpl-649c5acb-05c4-9b15-8b44-46efa5a3d896'}

Process finished with exit code 1
DavdGao commented 2 weeks ago

This error is raised because the DashScope model service requires at least one user message in the prompt. However, in your code the agent speaks first, so its prompt contains only the system message, as follows:

prompt = [
    {"role": "system", "content": "You're a helpful assistant named Friday."}
]

You can adjust the speaking order so that the user agent speaks first:

x = None
while True:
    x = user(x)
    if x.content == "exit":  # type "exit" to break the loop
        break
    x = agent(x)
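
With the user speaking first, the prompt the agent builds contains a user message alongside the system message, which satisfies the DashScope requirement. Roughly (the user content below is only illustrative):

prompt = [
    {"role": "system", "content": "You're a helpful assistant named Friday."},
    {"role": "user", "content": "Hello, Friday!"},
]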