bind_functions not taking effect?
Pretty strange: it was working just a moment ago, and now this error has appeared again.
Is there any way to have the supervisor return not only the decision about who should act next, but also the reasoning behind why that role was chosen?
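One way to get this (a sketch, not something from this thread): add a required "reasoning" property to the supervisor's routing function schema, so the model has to emit its rationale alongside the "next" value. The "Researcher"/"Coder" member names below are placeholders borrowed from the multi-agent notebook; adapt them to your graph.

```python
# Sketch: extend the supervisor's routing schema with a "reasoning" field.
# "Researcher" and "Coder" are hypothetical member names.
options = ["FINISH", "Researcher", "Coder"]
function_def = {
    "name": "route",
    "description": "Select the next role and explain the choice.",
    "parameters": {
        "type": "object",
        "properties": {
            "reasoning": {
                "type": "string",
                "description": "Step-by-step explanation of why this role should act next.",
            },
            "next": {"type": "string", "enum": options},
        },
        "required": ["reasoning", "next"],
    },
}
# JsonOutputFunctionsParser() would then return {"reasoning": ..., "next": ...}.
```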
This example uses the JsonOutputFunctionsParser:

```python
from langchain.output_parsers.openai_functions import JsonOutputFunctionsParser
```
Is the problem with the parser, or is the example in these notebooks using the wrong type of parser? I am also having issues running the example notebooks under langgraph/examples/multi_agent. I consistently get this same error.
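For context, the supervisor chain in the multi-agent notebook is roughly the following shape (a sketch reconstructed from memory, so details such as the exact prompt may differ):

```python
from langchain.output_parsers.openai_functions import JsonOutputFunctionsParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI

members = ["Researcher", "Coder"]
options = ["FINISH"] + members
function_def = {
    "name": "route",
    "description": "Select the next role.",
    "parameters": {
        "type": "object",
        "properties": {"next": {"type": "string", "enum": options}},
        "required": ["next"],
    },
}
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a supervisor managing: {members}. "
                   "Given the conversation, who should act next? One of: {options}."),
        MessagesPlaceholder(variable_name="messages"),
    ]
).partial(options=str(options), members=", ".join(members))

llm = ChatOpenAI(model="gpt-4")
supervisor_chain = (
    prompt
    | llm.bind_functions(functions=[function_def], function_call="route")
    | JsonOutputFunctionsParser()  # raises if the model returned no function_call
)
```

The parser reads `additional_kwargs["function_call"]` off the AIMessage; when the model replies with plain text instead of a function call, that key is missing, which produces exactly the "Could not parse function call: 'function_call'" error reported here.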
I ran into the same problem (https://github.com/QwenLM/Qwen2/issues/568). I assumed it was an issue with the Qwen model I was using, but after switching to an OpenAI key it still failed: the function was never called.
Which integration are you using to run Qwen? vLLM?
Yes. Does that matter? I later switched to the ChatGPT 3.5 API and function_call worked, but only intermittently; it felt like a matter of chance.
I'm using the GPT-4 API, wrapped with LangChain's ChatOpenAI class. bind_functions takes a parameter called 'function_call' with three possible values: 'none', 'auto', or a function name. With 'none', the model never calls a function. With 'auto', the LLM decides whether to call one. With a function name, that named function is always called.
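A minimal sketch of those three modes, assuming the langchain_openai ChatOpenAI wrapper and a function_def schema like the one shown earlier:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4")

# "none": never call a function; the model answers in plain text.
never = llm.bind_functions(functions=[function_def], function_call="none")

# "auto": the model decides whether to call a function.
maybe = llm.bind_functions(functions=[function_def], function_call="auto")

# A function name: force the model to always call that function.
always = llm.bind_functions(functions=[function_def], function_call="route")
```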
me too, that's a problem
Does this error mean the model never made a function_call at all, and instead answered you using its plain language ability?
I've found a solution.
If, like me, you wrapped the model with LangChain's ChatOpenAI class, go into that class's code and, inside the _stream or _generate method, pass the function list explicitly via the functions parameter. bind_functions and bind_tools are only compatible with models that support the OpenAI function calling API; models like Qwen obviously don't support it.
```python
def _generate(
    self,
    messages: List[BaseMessage],
    stop: Optional[List[str]] = None,
    run_manager: Optional[CallbackManagerForLLMRun] = None,
    stream: Optional[bool] = None,
    **kwargs: Any,
) -> ChatResult:
    should_stream = stream if stream is not None else self.streaming
    if should_stream:
        stream_iter = self._stream(
            messages, stop=stop, run_manager=run_manager, **kwargs
        )
        return generate_from_stream(stream_iter)
    message_dicts, params = self._create_message_dicts(messages, stop)
    params = {
        **params,
        **({"stream": stream} if stream is not None else {}),
        **kwargs,
    }
    # Hard-code the function schemas here instead of relying on bind_functions.
    functions = [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        }
    ]
    # convert_to_openai_function is available from
    # langchain_core.utils.function_calling.
    formatted_functions = [convert_to_openai_function(fn) for fn in functions]
    response = self.client.create(
        messages=message_dicts, functions=formatted_functions, **params
    )
    # response = self.client.create(messages=message_dicts, **params)
    return self._create_chat_result(response)
```
Why am I getting this error: 'ChatOpenAI' object has no attribute '_create_message_dicts'?
I found that bind_functions is only supported in ChatOpenAI?
I wouldn't recommend using .bind_functions and would instead use .bind_tools. We're also working on removing any mentions of .bind_functions from the LangGraph documentation, so going to close this issue.
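For reference, a minimal bind_tools sketch (the weather tool here is a toy stand-in mirroring the schema used earlier in the thread):

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def get_current_weather(location: str, unit: str = "celsius") -> str:
    """Get the current weather in a given location."""
    # Toy implementation; a real tool would call a weather API.
    return f"It is sunny in {location} (unit: {unit})."


llm = ChatOpenAI(model="gpt-4o")
llm_with_tools = llm.bind_tools([get_current_weather])

msg = llm_with_tools.invoke("What's the weather in San Francisco, CA?")
# Structured tool calls, e.g. [{"name": "get_current_weather", "args": {...}, ...}]
print(msg.tool_calls)
```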
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
```
raise OutputParserException(f"Could not parse function call: {exc}")
langchain_core.exceptions.OutputParserException: Could not parse function call: 'function_call'
```
Description
This is a problem I encountered while working through LangGraph's multi-agent blog post. It occurs not only there, but also in LangGraph's Planning Agent Examples module.
System Info
Latest version