modelscope / modelscope-agent

ModelScope-Agent: An agent framework connecting models in ModelScope with the world
https://modelscope-agent.readthedocs.io/en/latest/
Apache License 2.0
2.63k stars · 301 forks

Bad example: With Modelscope-Agent-Server, Qwen2 could be used by the OpenAI SDK with tool-calling ability, please find details in the doc. #482

Closed: monk-after-90s closed this issue 3 months ago

monk-after-90s commented 3 months ago

Initial Checks

  • [x] I have searched GitHub for a duplicate issue and I'm sure this is something new
  • [x] I have read and followed the docs & demos and still think this is a bug
  • [x] I am confident that the issue is with modelscope-agent (not my code, or another library in the ecosystem)

What happened + What you expected to happen

  1. I followed the doc step by step until “服务curl调用” (the "calling the service via curl" section)

  2. Expected behavior:

{
  "request_id": "chatcmpl_3f020464-e98d-4c7b-8717-9fca56784fe6",
  "message": "",
  "output": null,
  "id": "chatcmpl_3f020464-e98d-4c7b-8717-9fca56784fe6",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "好的,我已经调用了amap_weather工具查询了海淀区的天气情况。现在,让我为您展示一下查询结果吧。\n\n工具调用\nAction: amap_weather\nAction Input: {\"location\": \"海淀区\"}\n",
        "tool_calls": [
          {
            "type": "function",
            "function": {
              "name": "amap_weather",
              "arguments": "{\"location\": \"海淀区\"}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ],
  "created": 1717485704,
  "model": "Qwen2-7B-Instruct",
  "system_fingerprint": "chatcmpl_3f020464-e98d-4c7b-8717-9fca56784fe6",
  "object": "chat.completion",
  "usage": {
    "prompt_tokens": 237,
    "completion_tokens": 48,
    "total_tokens": 285
  }
}
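For reference, a client would consume the expected response above through the plain OpenAI-style schema: check `finish_reason`, then read `choices[0].message.tool_calls`, remembering that `function.arguments` arrives as a JSON string. A minimal sketch against the literal response shown:

```python
import json

# Trimmed copy of the expected response above; only the fields a
# tool-calling client actually needs are kept.
response = {
    "choices": [{
        "index": 0,
        "message": {
            "role": "assistant",
            "tool_calls": [{
                "type": "function",
                "function": {
                    "name": "amap_weather",
                    "arguments": "{\"location\": \"海淀区\"}",
                },
            }],
        },
        "finish_reason": "tool_calls",
    }],
}

choice = response["choices"][0]
if choice["finish_reason"] == "tool_calls":
    for call in choice["message"]["tool_calls"]:
        fn = call["function"]
        # arguments is a JSON-encoded string, not a dict, per the OpenAI schema
        args = json.loads(fn["arguments"])
        print(fn["name"], args["location"])
```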
  3. In fact, I got an "Internal Server Error", and the server log shows:
    INFO:     127.0.0.1:56998 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
    ERROR:    Exception in ASGI application
    Traceback (most recent call last):
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/home/anxu/.conda/envs/qwen2/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/mnt/mpatha/code/modelscope-agent/modelscope_agent_servers/assistant_server/api.py", line 151, in chat_completion
    agent = RolePlay(function_list=None, llm=llm_config, uuid_str=user)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/mnt/mpatha/code/modelscope-agent/modelscope_agent/agents/role_play.py", line 145, in __init__
    Agent.__init__(self, function_list, llm, storage_path, name,
    File "/mnt/mpatha/code/modelscope-agent/modelscope_agent/agent.py", line 43, in __init__
    self.llm = get_chat_model(**self.llm_config)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/mnt/mpatha/code/modelscope-agent/modelscope_agent/llm/__init__.py", line 23, in get_chat_model
    return LLM_REGISTRY[model_server](model, model_server, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    TypeError: VllmLLM.__init__() missing 3 required positional arguments: 'llm', 'tokenizer', and 'sampling_params'
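The failure in the traceback can be reproduced in isolation. `get_chat_model` dispatches through `LLM_REGISTRY` and calls every registered class the same way, as `cls(model, model_server, **kwargs)`, so any class registered under a `model_server` key whose `__init__` demands extra required positional arguments (as `VllmLLM` does) blows up at dispatch time. This is a sketch with hypothetical class bodies, not the actual modelscope-agent source:

```python
# Hypothetical minimal reconstruction of the registry dispatch in
# modelscope_agent/llm/__init__.py that produces the TypeError above.
LLM_REGISTRY = {}

def register_llm(name):
    def decorator(cls):
        LLM_REGISTRY[name] = cls
        return cls
    return decorator

@register_llm("openai")
class OpenAILLM:
    # Signature matches the cls(model, model_server, **kwargs) call site.
    def __init__(self, model, model_server, **kwargs):
        self.model = model

@register_llm("vllm")
class VllmLLM:
    # Requires llm/tokenizer/sampling_params, which the dispatcher never passes.
    def __init__(self, model, model_server, llm, tokenizer, sampling_params):
        self.llm = llm

def get_chat_model(model, model_server, **kwargs):
    # Same shape as the call in the traceback.
    return LLM_REGISTRY[model_server](model, model_server, **kwargs)

get_chat_model("Qwen2-7B-Instruct", "openai")  # works
try:
    get_chat_model("Qwen2-7B-Instruct", "vllm")
except TypeError as e:
    # "... missing 3 required positional arguments: 'llm', 'tokenizer',
    # and 'sampling_params'" -- exactly the error in the server log.
    print(e)
```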

Versions / Dependencies

modelscope-agent==0.6.1
Python 3.11.9
Ubuntu 22.04
CUDA 12.3
Driver Version: 555.52.04

Reproduction script

Follow the doc step by step in my environment (see above).

Issue Severity

High: It blocks me from completing my task.

njhouse365 commented 3 months ago

2024-06-12 14:08:05,926 - INFO - HTTP Request: POST http://192.168.204.120:8087/v1/chat/completions "HTTP/1.1 400 Bad Request"
INFO:     192.168.204.99:38768 - "POST /query_agent HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/fastapi/routing.py", line 193, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/starlette/concurrency.py", line 42, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
    return await future
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
    result = context.run(func, *args)
  File "/home/house365ai/xxm/modelscope-agent2/examples/agent_server2.py", line 49, in query_agent
    for chunk in response:
  File "/home/house365ai/xxm/modelscope-agent2/modelscope_agent/agents/role_play.py", line 289, in _run
    for s in output:
  File "/home/house365ai/xxm/modelscope-agent2/modelscope_agent/llm/openai.py", line 42, in _chat_stream
    response = self.client.chat.completions.create(
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 606, in create
    return self._post(
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
  File "/home/house365ai/anaconda3/envs/agent2/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None

dahaipeng commented 3 months ago

Thank you for raising this issue. Upon review, we have identified a potential cause: the `model_server` value used in this script conflicts with the `VllmLLM` class in the llm module. We are currently working on a solution and will open a PR to address this conflict shortly. @zzhangpurdue
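The conflict described can be pictured as two incompatible classes competing for the same registry key: an OpenAI-compatible HTTP client path (which only needs `model` and `model_server`) and a local in-process vLLM engine (which needs engine objects the server path never has). A hypothetical sketch of one way to disambiguate, with invented names and keys, not the actual fix in the PR:

```python
LLM_REGISTRY = {}

class OpenAIChatClient:
    """HTTP path against an OpenAI-compatible endpoint; no local engine."""
    def __init__(self, model, model_server, **kwargs):
        self.model = model
        self.model_server = model_server

class LocalVllmEngine:
    """In-process engine path; needs objects the dispatcher cannot supply."""
    def __init__(self, model, model_server, llm, tokenizer, sampling_params):
        self.llm = llm

# Reserve the plain 'vllm' key for the HTTP path so a user's
# model_server='vllm' no longer dispatches into LocalVllmEngine;
# 'vllm-local' is an invented key for the in-process path.
LLM_REGISTRY["vllm"] = OpenAIChatClient
LLM_REGISTRY["vllm-local"] = LocalVllmEngine

# The generic dispatch now succeeds for the server use case.
client = LLM_REGISTRY["vllm"]("Qwen2-7B-Instruct", "vllm")
```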

zzhangpurdue commented 3 months ago

Solved in #485.