QwenLM / Qwen2.5

Qwen2.5 is the large language model series developed by Qwen team, Alibaba Cloud.

[Question]: Qwen2.5 agent usage issue #918

Closed lonngxiang closed 10 hours ago

lonngxiang commented 12 hours ago

Has this been raised before?

Description

Running it produces an error:


# `client` is an OpenAI-compatible client for the vLLM server started below;
# `tools` holds the add_numbers function schema (a sketch follows the output).
response = client.chat.completions.create(
    model="/ai/qwen2.5/",
    messages=[{"role": "user", "content": "1+1="}],
    tools=tools,
    temperature=0.7,
    top_p=0.8,
    max_tokens=512,
    extra_body={
        "repetition_penalty": 1.05,
    },
)
print(response.choices[0])

Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(content='\n{{"name": "add_numbers", "arguments": {"a": 1, "b": 1}}}\n', role='assistant', function_call=None, tool_calls=[]), stop_reason=None)
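The `tools` argument isn't shown in the snippet; a minimal sketch of what it could look like, with the add_numbers schema and parameter types inferred from the reply above (illustrative assumptions, not the exact definition used):

# Hypothetical function schema matching the add_numbers call in the reply above.
tools = [
    {
        "type": "function",
        "function": {
            "name": "add_numbers",
            "description": "Add two numbers and return the sum.",
            "parameters": {
                "type": "object",
                "properties": {
                    "a": {"type": "number"},
                    "b": {"type": "number"},
                },
                "required": ["a", "b"],
            },
        },
    }
]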


jklj077 commented 12 hours ago

What is this? Is it vLLM? What version? How did you start it?

lonngxiang commented 12 hours ago

What is this? Is it vLLM? What version? How did you start it?

vllm 0.6.1.post2

CUDA_VISIBLE_DEVICES=1 vllm serve /ai/qwen2.5/ --host 192.16*** --port 10868 --max-model-len 2000 --trust-remote-code --api-key token-abc123 --gpu-memory-utilization 1 --disable-frontend-multiprocessing --tensor-parallel-size 1 --enable-auto-tool-choice --tool-call-parser hermes
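For completeness, the request in the original snippet assumes an OpenAI-compatible client pointed at this server; a minimal sketch (the host placeholder stands in for the redacted address, and the API key mirrors the --api-key flag above):

from openai import OpenAI

# Placeholder base URL; use the host/port the vLLM server above listens on.
client = OpenAI(
    base_url="http://<server-host>:10868/v1",
    api_key="token-abc123",  # same value as --api-key in the serve command
)
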
jklj077 commented 11 hours ago

Hi, we have identified an issue with the released chat template, which we will fix soon.

For now, please try the following template:

{%- if tools %}
    {{- '<|im_start|>system\n' }}
    {%- if messages[0]['role'] == 'system' %}
        {{- messages[0]['content'] }}
    {%- else %}
        {{- 'You are Qwen, created by Alibaba Cloud. You are a helpful assistant.' }}
    {%- endif %}
    {{- "\n\n# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
    {%- for tool in tools %}
        {{- "\n" }}
        {{- tool | tojson }}
    {%- endfor %}
    {{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
{%- else %}
    {%- if messages[0]['role'] == 'system' %}
        {{- '<|im_start|>system\n' + messages[0]['content'] + '<|im_end|>\n' }}
    {%- else %}
        {{- '<|im_start|>system\nYou are Qwen, created by Alibaba Cloud. You are a helpful assistant.<|im_end|>\n' }}
    {%- endif %}
{%- endif %}
{%- for message in messages %}
    {%- if (message.role == "user") or (message.role == "system" and not loop.first) or (message.role == "assistant" and not message.tool_calls) %}
        {{- '<|im_start|>' + message.role + '\n' + message.content + '<|im_end|>' + '\n' }}
    {%- elif message.role == "assistant" %}
        {{- '<|im_start|>' + message.role }}
        {%- if message.content %}
            {{- '\n' + message.content }}
        {%- endif %}
        {%- for tool_call in message.tool_calls %}
            {%- if tool_call.function is defined %}
                {%- set tool_call = tool_call.function %}
            {%- endif %}
            {{- '\n<tool_call>\n{"name": "' }}
            {{- tool_call.name }}
            {{- '", "arguments": ' }}
            {{- tool_call.arguments | tojson }}
            {{- '}\n</tool_call>' }}
        {%- endfor %}
        {{- '<|im_end|>\n' }}
    {%- elif message.role == "tool" %}
        {%- if (loop.index0 == 0) or (messages[loop.index0 - 1].role != "tool") %}
            {{- '<|im_start|>user' }}
        {%- endif %}
        {{- '\n<tool_response>\n' }}
        {{- message.content }}
        {{- '\n</tool_response>' }}
        {%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
            {{- '<|im_end|>\n' }}
        {%- endif %}
    {%- endif %}
{%- endfor %}
{%- if add_generation_prompt %}
    {{- '<|im_start|>assistant\n' }}
{%- endif %}

You can save it to, say, template.jinja and apply it with:

vllm ... --enable-auto-tool-choice --tool-call-parser hermes --chat-template template.jinja 
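As a quick sanity check before serving, the new template can also be rendered offline with transformers' apply_chat_template (assuming a transformers release recent enough to accept the tools and chat_template arguments); the add_numbers schema and the tool result below are illustrative assumptions:

from transformers import AutoTokenizer

# Local checkpoint path from the serve command; adjust to your environment.
tokenizer = AutoTokenizer.from_pretrained("/ai/qwen2.5/")

with open("template.jinja") as f:
    chat_template = f.read()

# Illustrative schema matching the add_numbers call seen in the issue.
tools = [{"type": "function", "function": {
    "name": "add_numbers", "description": "Add two numbers.",
    "parameters": {"type": "object",
                   "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
                   "required": ["a", "b"]}}}]

# One full round trip: user question, assistant tool call, tool result.
# Note the template expects `arguments` as an object, not a JSON string.
messages = [
    {"role": "user", "content": "1+1="},
    {"role": "assistant", "content": "",
     "tool_calls": [{"type": "function",
                     "function": {"name": "add_numbers",
                                  "arguments": {"a": 1, "b": 1}}}]},
    {"role": "tool", "content": "2"},  # hypothetical tool output
]

text = tokenizer.apply_chat_template(
    messages,
    tools=tools,
    chat_template=chat_template,
    tokenize=False,
    add_generation_prompt=True,
)
# The rendered prompt should list the schema inside <tools>...</tools> and the
# call inside <tool_call>...</tool_call>, with the tool output in <tool_response>.
print(text)
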
lonngxiang commented 11 hours ago

--chat-template template.jinja

Okay, looking forward to the fix.

jklj077 commented 10 hours ago

Hi, this should be fixed now. Please pull the model files again (updating tokenizer_config.json is enough). Feel free to reopen if the problem persists.