QwenLM / Qwen2

Qwen2 is the large language model series developed by Qwen team, Alibaba Cloud.

langchain + Qwen1.5-Chat + vLLM (Chat): functions or tools bound to the LLM never trigger a call? #137

Closed Songjiadong closed 2 months ago

Songjiadong commented 5 months ago

My vLLM deployment command:

python -m vllm.entrypoints.openai.api_server \
    --model=/usr/local/models/Qwen/Qwen1.5-7B-Chat \
    --trust-remote-code \
    --served-model-name qwmiic \
    --host 127.0.0.1 \
    --port 9999 \
    --dtype=half

After the server starts normally, I run code similar to the following:

from langchain_community.tools import ArxivQueryRun
from langchain_core.utils.function_calling import convert_to_openai_function
from langchain_openai import ChatOpenAI

# Wrap the arXiv tool and convert it to the OpenAI function schema.
TOOLS = [ArxivQueryRun()]
functions = [convert_to_openai_function(t) for t in TOOLS]
print(functions)

# Point ChatOpenAI at the local vLLM OpenAI-compatible server.
inference_server_url = "http://127.0.0.1:9999/v1"
llm = ChatOpenAI(
    model="qwmiic",
    openai_api_key="EMPTY",
    openai_api_base=inference_server_url,
    max_tokens=512,
    temperature=1,
)

arxiv = ArxivQueryRun()  # invoking this tool directly returns the correct paper

# Pass the function schemas along with the user message.
message = llm.invoke("论文编号:1605.08386", functions=functions)

I have verified that the generated `functions` payload is well-formed, and invoking `arxiv` directly returns the correct article content. But when combined with the LLM or with a chain, no function call is ever triggered, whether I bind correctly via `llm.bind(functions=functions)` or pass them directly to `llm.invoke` as above; the model just replies with ordinary prose. Is this a bug, and how should I fix it?

JianxinMa commented 5 months ago

This should be the same issue as https://github.com/QwenLM/Qwen1.5/issues/15#issuecomment-1933950910.
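If the serving endpoint does not interpret the OpenAI `functions` field, one workaround is to do function calling client-side: render the tool schemas into the prompt yourself and parse a JSON tool call out of the model's reply. The sketch below is illustrative only, not code from this thread; the prompt wording and helper names are assumptions.

```python
import json
import re


def build_tool_prompt(functions):
    """Render OpenAI-style function schemas into a plain-text system prompt,
    for servers that ignore the `functions` request field (assumption here)."""
    lines = ["You have access to the following tools:"]
    for fn in functions:
        lines.append(json.dumps(fn, ensure_ascii=False))
    lines.append(
        'To call a tool, reply with exactly one JSON object of the form '
        '{"name": "<tool name>", "arguments": {...}} and nothing else.'
    )
    return "\n".join(lines)


def parse_tool_call(text):
    """Extract a {"name": ..., "arguments": ...} object from model output;
    return None when the reply is plain prose with no tool call."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if not match:
        return None
    try:
        call = json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
    if "name" in call and "arguments" in call:
        return call
    return None
```

The parser can then dispatch to the matching LangChain tool (e.g. `ArxivQueryRun`) and feed the tool's result back to the model as a follow-up message, mirroring the usual function-calling loop by hand.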

github-actions[bot] commented 2 months ago

This issue has been automatically marked as inactive due to lack of recent activity. Should you believe it remains unresolved and warrants attention, kindly leave a comment on this thread.