Open FanZhang91 opened 1 month ago
MARK! need fix +1
function call api, plz
Mainly, you need to put your function's name in tool_choice; for your code that would be tool_choice="get_current_weather"
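For reference, a minimal sketch of the named-tool form, assuming the OpenAI chat-completions schema: a named tool choice is an object, while the only bare-string values that schema defines are "none"/"auto"/"required", so the bare string form above may be rejected by the client's validation.

```python
import json

# Sketch of the *named* tool_choice object from the OpenAI chat-completions
# schema (this dict is what gets sent instead of the bare string
# "get_current_weather" suggested above).
tool_choice = {
    "type": "function",
    "function": {"name": "get_current_weather"},
}

# It must serialize cleanly into the request body.
print(json.dumps(tool_choice))
```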
The same problem here.
Same problem on v0.5.4.
Trying Llama 3.1, the same behavior occurs with "tool_choice": "required". If tool_choice is omitted altogether, the output is in text form, e.g.,
Based on the given prompt, I will call the "search_movies" function with the required parameters. Here's the JSON for the function call:
{
  "name": "search_movies",
  "parameters": {
    "genre": "comedy",
    "query_string": "2015 comedy movies",
    "year": 2015
  }
}
Happy to try to debug this further.
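In a pinch, the text-form reply above can be parsed back into a structured call. A minimal sketch, assuming the model reliably emits one well-formed JSON object in its message; the extract_first_json helper is hypothetical, not part of any library:

```python
import json

# Sample text copied from the reply above.
text = '''Based on the given prompt, I will call the "search_movies" function with the required parameters. Here's the JSON for the function call:
{
"name": "search_movies",
"parameters": {
"genre": "comedy",
"query_string": "2015 comedy movies",
"year": 2015
}
}'''

def extract_first_json(s):
    # Hypothetical helper: scan for the first balanced {...} block and parse
    # it. Assumes no stray braces inside JSON string values.
    start = s.find("{")
    depth = 0
    for i, ch in enumerate(s[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return json.loads(s[start:i + 1])
    return None

call = extract_first_json(text)
print(call["name"], call["parameters"])
```

This is only a fallback for debugging; it does not give you the structured tool_calls field the API is supposed to return.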
qwen2, same error
openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': "[{'type': 'extra_forbidden', 'loc': ('body', 'tool_choice'), 'msg': 'Extra inputs are not permitted', 'input': 'auto'}, {'type': 'extra_forbidden', 'loc': ('body', 'tools'), 'msg': 'Extra inputs are not permitted', 'input': [{'type': 'function', 'function': {'name': 'get_weather', 'description': '获取某个地点的天气,用户应首先提供一个地点', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': '城市,例如:北京'}}, 'required': ['location']}}}, {'type': 'function', 'function': {'name': 'get_time', 'description': '获取当前时间,给一个地方点', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': '城市,例如:上海'}}, 'required': ['location']}}}]}]", 'type': 'BadRequestError', 'param': None, 'code': 400}
Hello, any update here? I have the same problem: 'type': 'value_error', 'loc': ('body',), 'msg': 'Value error, Currently only named tools are supported.'
same problem here.
Hi guys, I tried writing the following code snippet, which should help resolve the error mentioned above.
"tools": [
    {
        "type": "function",
        "function": {
            "name": "query_weather",
            "description": "Provides weather information for a specified location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The location for which to retrieve weather information."
                    }
                },
                "required": ["location"]
            }
        }
    }
],
"tool_choice": {
    "type": "function",
    "function": {
        "name": "query_weather"
    }
},
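Assembled into a full request body, the fragment above looks like this; the model name and user message are placeholder assumptions, not from the snippet:

```python
import json

# Hedged sketch: the named-tool fragment above, placed into a complete
# chat-completions request body. In practice these keys are passed as
# keyword arguments to client.chat.completions.create(...).
request_body = {
    "model": "Qwen2-1.5B-Instruct",  # placeholder
    "messages": [
        {"role": "user", "content": "What is the weather in Beijing?"}  # placeholder
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "query_weather",
                "description": "Provides weather information for a specified location.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The location for which to retrieve weather information.",
                        }
                    },
                    "required": ["location"],
                },
            },
        }
    ],
    "tool_choice": {
        "type": "function",
        "function": {"name": "query_weather"},
    },
}

print(json.dumps(request_body, indent=2))
```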
Is there a complete code example? What is tool_choice?
Your current environment
Device: Nvidia GeForce 4090. Software: vllm 0.5.2 + openai 1.30.5 + transformers 4.42.4
🐛 Describe the bug
I use the OpenAI API and vLLM to deploy a local Qwen2 LLM, but vLLM's function call mode does not work. The OpenAI interface correctly passes the tools parameters to vLLM, but vLLM does not use them. If I enable the tool_choice="auto" parameter, I get a 400 error.
---------------------------------------------------------------------server script-------------------------------------------------------------
python entrypoints/openai/api_server.py --model="xxx/Qwen2-1.5B-Instruct" --trust-remote-code --host "localhost" --port 8000 --dtype auto
-------------------------------------------------------------client code ------------------------------------------------------------------
from openai import OpenAI

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]

openai_api_key = "xxx"
openai_api_base = "http://localhost:8000/v1/"

client = OpenAI(
    api_key=openai_api_key,
    base_url=openai_api_base,
)

models = client.models.list()
model = models.data[0].id

chat_completion = client.chat.completions.create(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "help me query current weather in Beijing."},
    ],
    model=model,
    tools=tools,
    tool_choice="auto",
)

print("response: ", chat_completion.choices[0].message)
--------------------------------------------extra info && response ----------------------------------------------
response: ChatCompletionMessage(content='xxx', role='assistant', function_call=None, tool_calls=[])
---------------------------------- enable tool_choice="auto" parameter ----------------------------------
openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': "[{'type': 'value_error', 'loc': ('body',), 'msg': 'Value error, Currently only named tools are supported.', 'input': {'messages': [{'role': 'system', 'content': 'You are a helpful assistant.'}, {'role': 'user', 'content': 'help me query currnet weather in San Francisco.'}], 'model': '/home/zhangfan/deep_learning/Qwen2/examples/sft/Qwen2-1.5B-Instruct', 'tool_choice': 'auto', 'tools': [{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}]}, 'ctx': {'error': ValueError('Currently only named tools are supported.')}}]", 'type': 'BadRequestError', 'param': None, 'code': 400}