Open buptstehc opened 2 months ago
Hi,
I can indeed reproduce this with Qwen2-7B-Instruct and Qwen2-72B-Instruct with your original setup.
However, I think that a little prompt engineering could help for Qwen2-72B-Instruct:
P.S.: the value of `parameters` in a function definition is normally a JSON Schema.
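For illustration, a minimal function definition whose `parameters` value is a valid JSON Schema might look like the sketch below (the `get_weather` function and its fields are hypothetical, not taken from this thread):

```python
# Hypothetical function definition; `parameters` is a JSON Schema object.
get_weather = {
    'name': 'get_weather',
    'description': 'Get the current weather for a city',
    'parameters': {
        'type': 'object',  # JSON Schema: top level is an object
        'properties': {
            'city': {'type': 'string', 'description': 'City name'},
        },
        'required': ['city'],
    },
}
```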
Qwen2-7B cannot achieve the same results even with prompt engineering. Could this be a limitation of the model itself?
Hi,
For Qwen2-7B-Instruct, prompt engineering also works; you just need to experiment (this is related to your business logic, though, so you should adjust it on your own):
If you find prompt engineering hard, there are abundant materials online.
Piggybacking on this thread: do qwen2-7b-instruct and qwen2-72b-instruct natively support Tools (the upgraded form of function calling)? I haven't found any information on this. If they do, will calling Tools on my self-deployed qwen2-7b-instruct and qwen2-72b-instruct (downloaded from HF) behave the same as calling Tools on the qwen2-72b-instruct in Alibaba Cloud Bailian - Model Center - DashScope?
I am hitting the same problem, and no prompt I tried has any effect.
```python
# Step 1: send the conversation and available functions to the model
messages = [{'role': 'user', 'content': '呼叫'}]
functions = [
    {
        'name': 'call',
        'description': '呼叫一个或多个人员',
        'parameters': {
            'type': 'object',
            'properties': {
                'names': {
                    'type': 'list',
                    'description': '姓名列表',
                },
            },
            'required': ['names'],
        },
    },
]
responses = llm.chat(
    messages=messages,
    functions=functions,
    stream=False,
    extra_generate_cfg=dict(function_choice='auto'),
)
print(responses)
```
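Given the earlier note that `parameters` should be a JSON Schema, one possible fix is to declare `names` with the standard `array` type plus an `items` schema instead of the non-standard `list`. This is only a sketch; whether it resolves the issue for you may depend on the model and deployment:

```python
# Corrected schema sketch: JSON Schema uses 'array' (with 'items'),
# not 'list', for list-valued parameters.
functions = [
    {
        'name': 'call',
        'description': '呼叫一个或多个人员',
        'parameters': {
            'type': 'object',
            'properties': {
                'names': {
                    'type': 'array',
                    'items': {'type': 'string'},
                    'description': '姓名列表',
                },
            },
            'required': ['names'],
        },
    },
]
```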