langchain-ai / langchain


LLM with_fallbacks function does not work with create_tool_calling_agent #20499

Open eav-solution opened 4 months ago

eav-solution commented 4 months ago

Checked other resources

Example Code

```python
from langchain.agents import create_tool_calling_agent

# Attach fallback models to the primary LLM, then build the tool-calling agent
self.llm = self.llm.with_fallbacks(fallbackModels)

self.agent = create_tool_calling_agent(
    self.llm.llm, self.tools, self.promptTemplate.getAgentPrompt(self.tools)
)
```

Error Message and Stack Trace (if applicable)

```
  self.agent = create_tool_calling_agent(self.llm.llm, self.tools, self.promptTemplate.getAgentPrompt(self.tools))
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain/agents/tool_calling_agent/base.py", line 85, in create_tool_calling_agent
    raise ValueError(
ValueError: This function requires a .bind_tools method be implemented on the LLM.
```

Description

My code works fine with create_tool_calling_agent when I don't call the with_fallbacks function.

System Info

langchain==0.1.16
langchain-community==0.0.33
langchain-core==0.1.43
langchain-experimental==0.0.49
langchain-google-genai==1.0.1
langchain-openai==0.1.3
langchain-text-splitters==0.0.1
langchainhub==0.1.14

platform: linux
python: 3.11

liugddx commented 4 months ago

Currently, only some LLMs support the bind_tools method.
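For example, a chat model that does implement tool calling accepts bind_tools directly on the bare model (a minimal sketch, assuming langchain-openai is installed and OPENAI_API_KEY is set; the add tool is just an illustration):

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

# ChatOpenAI implements bind_tools, so binding works on the bare model...
llm_with_tools = ChatOpenAI(model="gpt-3.5-turbo").bind_tools([add])
# ...the error in this issue only appears once the model is wrapped by with_fallbacks.
```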

sepiatone commented 4 months ago

This might be an issue with attributes being "hidden" by Runnable methods (here with_fallbacks): the object returned by with_fallbacks is a wrapper rather than the chat model itself.

```python
from langchain_openai import ChatOpenAI
from langchain_groq import ChatGroq

model_1 = ChatOpenAI().with_fallbacks([ChatGroq()])  # fallbacks wrapper
model_2 = ChatOpenAI()                               # bare chat model

print(hasattr(model_1, "bind_tools"))
print(hasattr(model_2, "bind_tools"))
```

which prints

```
False
True
```

I think @eyurtsev mentioned this somewhere, but I can't find it now.
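For anyone stuck on langchain 0.1.x, one possible workaround (a sketch based on the behaviour shown above, not the upstream fix; the helper name tool_calling_agent_with_fallbacks is made up, and the import paths assume a recent 0.1.x release where create_tool_calling_agent exists) is to bind the tools to each concrete chat model before adding fallbacks, then assemble the agent runnable by hand the same way create_tool_calling_agent does internally:

```python
from langchain.agents.format_scratchpad.tools import format_to_tool_messages
from langchain.agents.output_parsers.tools import ToolsAgentOutputParser
from langchain_core.runnables import RunnablePassthrough


def tool_calling_agent_with_fallbacks(primary_llm, fallback_llms, tools, prompt):
    # bind_tools is called on the concrete chat models, where it exists;
    # the fallbacks wrapper is applied afterwards.
    llm_with_tools = primary_llm.bind_tools(tools).with_fallbacks(
        [llm.bind_tools(tools) for llm in fallback_llms]
    )
    return (
        RunnablePassthrough.assign(
            agent_scratchpad=lambda x: format_to_tool_messages(x["intermediate_steps"])
        )
        | prompt
        | llm_with_tools
        | ToolsAgentOutputParser()
    )


# agent = tool_calling_agent_with_fallbacks(ChatOpenAI(), [ChatGroq()], tools, prompt)
# executor = AgentExecutor(agent=agent, tools=tools)  # from langchain.agents import AgentExecutor
```

The prompt still needs an agent_scratchpad placeholder, exactly as with create_tool_calling_agent.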

eav-solution commented 4 months ago

Could you please suggest a workaround for this issue?

ninikolov commented 3 months ago

I am having the same problem with the AzureChatOpenAI model: bind_tools fails even though https://python.langchain.com/docs/integrations/chat/ says tool calling is implemented for this model. Any workaround?
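To separate the Azure model itself from the fallbacks wrapper, a quick check could look like this (a sketch; it assumes the usual AZURE_OPENAI_ENDPOINT / AZURE_OPENAI_API_KEY environment variables are set, and the deployment name is hypothetical):

```python
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(azure_deployment="my-gpt-4-deployment", api_version="2024-02-01")

print(hasattr(llm, "bind_tools"))                        # True: the bare model implements it
print(hasattr(llm.with_fallbacks([llm]), "bind_tools"))  # False on langchain-core < 0.2.2
```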

maikishore commented 3 months ago

Same error; I tried the workaround but it persists: ValueError: This function requires a .bind_tools method be implemented on the LLM.

bmwise14 commented 1 month ago

Has this been resolved, or is there a workaround? I too am experiencing this error, specifically the NotImplementedError for AzureChatOpenAI.

baskaryan commented 1 month ago

I believe this was fixed in https://github.com/langchain-ai/langchain/pull/22139. Could you confirm whether you still see this issue with langchain-core>=0.2.2?

[Screenshot attached: 2024-07-09, 9:52 PM]
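For anyone re-testing after the fix, a minimal sketch (assuming an OpenAI API key is configured; the echo tool and model names are placeholders) is to upgrade with pip install -U langchain "langchain-core>=0.2.2" and rebuild the agent from a with_fallbacks model:

```python
from langchain.agents import create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def echo(text: str) -> str:
    """Return the input unchanged."""
    return text


prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant."),
        ("human", "{input}"),
        ("placeholder", "{agent_scratchpad}"),
    ]
)

llm = ChatOpenAI(model="gpt-3.5-turbo").with_fallbacks([ChatOpenAI(model="gpt-4o-mini")])

# On langchain-core >= 0.2.2 this should no longer raise
# "ValueError: This function requires a .bind_tools method be implemented on the LLM."
agent = create_tool_calling_agent(llm, [echo], prompt)
```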