Haxeebraja opened 5 months ago
I think the issue here is with Llama tool calling, not with LangGraph.
@vbarda The following example demonstrates Llama 3 support, but it uses Ollama: https://github.com/langchain-ai/langgraph/blob/main/examples/rag/langgraph_rag_agent_llama3_local.ipynb
Is the issue then in ChatOpenAI/vLLM? Do we have an example of ChatOpenAI pointed at vLLM?
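For reference, here is a minimal sketch of tool calling through ChatOpenAI against a vLLM OpenAI-compatible server; the endpoint URL, model name, and the get_weather tool are hypothetical placeholders, not from this issue:

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def get_weather(city: str) -> str:
    """Return a canned weather report for a city."""
    return f"Sunny in {city}"

# Hypothetical vLLM server exposing the OpenAI-compatible API.
llm = ChatOpenAI(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
    base_url="http://localhost:8000/v1",
    api_key="EMPTY",
    temperature=0,
)

llm_with_tools = llm.bind_tools([get_weather])
print(llm_with_tools.invoke("What is the weather in Paris?").tool_calls)
```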
I use Llama3.1-8B-Instruct and get a 400 error:

```
BadRequestError: Error code: 400 - {'object': 'error', 'message': "[{'type': 'extra_forbidden', 'loc': ('body', 'parallel_tool_calls'), 'msg': 'Extra inputs are not permitted', 'input': False}]", 'type': 'BadRequestError', 'param': None, 'code': 400}
```
I removed the parallel_tool_calls parameter and the call succeeds.
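Concretely, the failing and working calls differ only in the flag. A sketch, reusing the hypothetical llm and get_weather from the snippet above:

```python
# Fails against vLLM: LangChain puts parallel_tool_calls=False in the request
# body, and vLLM's strict schema rejects it as an extra input.
failing = llm.bind_tools([get_weather], parallel_tool_calls=False)

# Succeeds: the flag is omitted, so it is never sent to the server.
working = llm.bind_tools([get_weather])
```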
The conflict arises from differing function-calling support: Llama 3.1 (served through vLLM) does not support this parameter, while OpenAI does.
https://platform.openai.com/docs/guides/function-calling
Can someone handle this issue?
Facing the same issue. It looks like vLLM doesn't accept function_call and functions as parameters, but it does accept tools.
You should modify your LLM definition to:

```python
llm = ChatOpenAI(
    model_name="Meta-Llama-3-70B-Instruct",
    base_url="http://172.17.0.8:xxxx/v1/",
    api_key="EMPTY",
    temperature=0,
    # Drop parallel_tool_calls from outgoing requests so vLLM accepts the body.
    disabled_params={"parallel_tool_calls": None},
).bind(response_format={"type": "json_object"})
```

Note that disabled_params is a ChatOpenAI constructor field, so it belongs in the constructor rather than in .bind(), which would forward it to the server as yet another unknown parameter.
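Since this issue is about LangGraph, a usage sketch with the prebuilt agent may also help. It assumes the same placeholder endpoint and the hypothetical get_weather tool from the first snippet, and it passes an unbound model because create_react_agent binds tools itself:

```python
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

# Same disabled_params fix, applied to a plain (unbound) model.
agent_llm = ChatOpenAI(
    model_name="Meta-Llama-3-70B-Instruct",
    base_url="http://172.17.0.8:xxxx/v1/",
    api_key="EMPTY",
    temperature=0,
    disabled_params={"parallel_tool_calls": None},
)

agent = create_react_agent(agent_llm, tools=[get_weather])
result = agent.invoke({"messages": [("user", "What is the weather in Paris?")]})
print(result["messages"][-1].content)
```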
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
Description
I have tried instantiating ChatOpenAI as follows:

```python
llm = ChatOpenAI(
    model_name="Meta-Llama-3-70B-Instruct",
    base_url="http://172.17.0.8:xxxx/v1/",
    api_key="EMPTY",
    temperature=0,
)
```

as well as:

```python
llm = ChatOpenAI(
    model_name="Meta-Llama-3-70B-Instruct",
    base_url="http://172.17.0.8:xxxx/v1/",
    api_key="EMPTY",
    temperature=0,
).bind(response_format={"type": "json_object"})
```
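A hedged sketch of exercising the second binding: the prompt mentions JSON explicitly, since OpenAI's json_object mode requires the word to appear in the messages, and vLLM's OpenAI-compatible server implements the same response_format field:

```python
# Invoke the json_object binding defined above; mentioning "JSON" in the
# prompt matches OpenAI's requirement for json_object mode.
msg = llm.invoke("Return a JSON object with keys 'city' and 'summary' for Paris.")
print(msg.content)
```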
System Info
Meta's Llama 3 70B Instruct, locally hosted on vLLM. ChatOpenAI works fine for other applications, for example RAG and LCEL.