langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Loss of function in component ChatZhipuAI #23868

Open boli-design opened 3 months ago

boli-design commented 3 months ago

Example Code

  import os

  from langchain_community.tools.tavily_search import TavilySearchResults
  from langchain_community.chat_models import ChatZhipuAI
  from langchain_core.messages import HumanMessage

  # Environment variables
  os.environ["LANGCHAIN_TRACING_V2"] = "true"
  os.environ["LANGCHAIN_API_KEY"] = "balabala"
  os.environ["ZHIPUAI_API_KEY"] = "balabala"
  os.environ["TAVILY_API_KEY"] = "balabala"

  llm = ChatZhipuAI(model="glm-4")

  search = TavilySearchResults(max_results=2)
  tools = [search]

  llm_with_tools = llm.bind_tools(tools)
  response = llm_with_tools.invoke([HumanMessage(content="hello")])
  print(response.content)

Error Message and Stack Trace (if applicable)

    Exception has occurred: NotImplementedError (no description)
      File "E:\GitHub\langchain\1\agent_1.py", line 33, in <module>
        llm_with_tools = llm.bind_tools(tools)
                         ^^^^^^^^^^^^^^^^^^^^^
    NotImplementedError:

Description

The code in `langchain_core\language_models\chat_models.py` (`BaseChatModel.bind_tools`) is incomplete, as shown below:

    def bind_tools(
        self,
        tools: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable, BaseTool]],
        **kwargs: Any,
    ) -> Runnable[LanguageModelInput, BaseMessage]:
        raise NotImplementedError()

Following the ChatOpenAI implementation, the code should be:

    def bind_tools(
        self,
        tools: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable, BaseTool]],
        *,
        tool_choice: Optional[
            Union[dict, str, Literal["auto", "none", "required", "any"], bool]
        ] = None,
        **kwargs: Any,
    ) -> Runnable[LanguageModelInput, BaseMessage]:
        """Bind tool-like objects to this chat model.

        Assumes model is compatible with OpenAI tool-calling API.

        Args:
            tools: A list of tool definitions to bind to this chat model.
                Can be  a dictionary, pydantic model, callable, or BaseTool. Pydantic
                models, callables, and BaseTools will be automatically converted to
                their schema dictionary representation.
            tool_choice: Which tool to require the model to call.
                Options are:
                name of the tool (str): calls corresponding tool;
                "auto": automatically selects a tool (including no tool);
                "none": does not call a tool;
                "any" or "required": force at least one tool to be called;
                True: forces tool call (requires `tools` be length 1);
                False: no effect;

                or a dict of the form:
                {"type": "function", "function": {"name": <<tool_name>>}}.
            **kwargs: Any additional parameters to pass to the
                :class:`~langchain.runnable.Runnable` constructor.
        """

        formatted_tools = [convert_to_openai_tool(tool) for tool in tools]
        if tool_choice:
            if isinstance(tool_choice, str):
                # tool_choice is a tool/function name
                if tool_choice not in ("auto", "none", "any", "required"):
                    tool_choice = {
                        "type": "function",
                        "function": {"name": tool_choice},
                    }
                # 'any' is not natively supported by OpenAI API.
                # We support 'any' since other models use this instead of 'required'.
                if tool_choice == "any":
                    tool_choice = "required"
            elif isinstance(tool_choice, bool):
                tool_choice = "required"
            elif isinstance(tool_choice, dict):
                tool_names = [
                    formatted_tool["function"]["name"]
                    for formatted_tool in formatted_tools
                ]
                if not any(
                    tool_name == tool_choice["function"]["name"]
                    for tool_name in tool_names
                ):
                    raise ValueError(
                        f"Tool choice {tool_choice} was specified, but the only "
                        f"provided tools were {tool_names}."
                    )
            else:
                raise ValueError(
                    f"Unrecognized tool_choice type. Expected str, bool or dict. "
                    f"Received: {tool_choice}"
                )
            kwargs["tool_choice"] = tool_choice
        return super().bind(tools=formatted_tools, **kwargs)

After replacing this part, the code runs well.
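The `tool_choice` normalization in that patch can be exercised in isolation. Below is a self-contained sketch of the same rules, with the dict-validation branch reduced to a pass-through for brevity; the function name `normalize_tool_choice` is hypothetical and not part of langchain:

```python
from typing import Optional, Union


def normalize_tool_choice(
    tool_choice: Optional[Union[dict, str, bool]],
) -> Optional[Union[dict, str, bool]]:
    """Mirror the normalization in the proposed bind_tools patch:
    a bare tool name becomes an OpenAI-style dict, 'any' and True
    become 'required', and falsy values pass through unchanged."""
    if not tool_choice:
        return tool_choice
    if isinstance(tool_choice, bool):
        # True forces a tool call (bind_tools additionally requires
        # exactly one tool in this case).
        return "required"
    if isinstance(tool_choice, str):
        if tool_choice not in ("auto", "none", "any", "required"):
            # Treat the string as a tool/function name.
            return {"type": "function", "function": {"name": tool_choice}}
        # 'any' is not natively supported by the OpenAI API.
        if tool_choice == "any":
            return "required"
        return tool_choice
    if isinstance(tool_choice, dict):
        # The real patch also validates the name against the bound tools.
        return tool_choice
    raise ValueError(
        f"Unrecognized tool_choice type. Expected str, bool or dict. "
        f"Received: {tool_choice}"
    )
```

This makes it easy to verify the edge cases (`"any"`, `True`, a bare tool name) without instantiating a chat model.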

System Info

Python 3.12.3
langchain==0.2.6
langchain-chroma==0.1.2
langchain-community==0.2.6
langchain-core==0.2.10
langchain-huggingface==0.0.3
langchain-openai==0.1.14
langchain-text-splitters==0.2.2
langserve==0.2.2
langsmith==0.1.82

keenborder786 commented 3 months ago

@boli-design you are right; according to their official documentation, ChatZhipuAI supports tool invocation. I am going to add this feature, which is currently missing.

boli-design commented 3 months ago

In addition, I found another related bug. The code:

import os
import getpass
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_community.chat_models import ChatZhipuAI
from langchain_core.messages import HumanMessage
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI

# Environment variables
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "balabala"
os.environ["ZHIPUAI_API_KEY"] = "balabala"
os.environ["TAVILY_API_KEY"] = "balabala"

llm_1=ChatZhipuAI(model="glm-4")

llm_2 = ChatOpenAI(
    temperature=0.95,
    model="glm-4",
    openai_api_key="balabala",
    openai_api_base="https://open.bigmodel.cn/api/paas/v4/"
)

search = TavilySearchResults(max_results=2)
tools = [search]

agent_1 = create_react_agent(llm_1, tools)
agent_2 = create_react_agent(llm_2, tools)

response = agent_2.invoke(
    {"messages": [HumanMessage(content="whats the weather in sf?")]}
)
print("agent_2",response["messages"])

# here the error occurs
response = agent_1.invoke(
    {"messages": [HumanMessage(content="whats the weather in sf?")]}
)
print("agent_1",response["messages"])
'''
Exception has occurred: TypeError
Got unknown type 'ToolMessage'.
  File "E:\GitHub\langchain\1\test.py", line 37, in <module>
    response = agent_1.invoke(
               ^^^^^^^^^^^^^^^
TypeError: Got unknown type 'ToolMessage'.
'''


When using ChatOpenAI, there is no error. I could not find how to fix it; maybe it is better to use ChatOpenAI to call other models?

yaohanze commented 2 months ago

Hello, is this resolved? I tried to use bind_tools for ChatZhipuAI as well and got the same error, and updating to the latest version of langchain_community doesn't solve the issue. Thank you.

boli-design commented 2 months ago

> Hello, is this resolved? I tried to use bind_tools for ChatZhipuAI as well and got the same error. And updating to the latest version of langchain_community doesn't solve the issue. Thank you.

I suggest using ChatOpenAI instead, which can also call the ZhipuAI LLM.
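Concretely, the suggested workaround is to point ChatOpenAI at ZhipuAI's OpenAI-compatible endpoint, as in the earlier snippet. A minimal sketch of the settings (the API key is a placeholder):

```python
# Settings for calling GLM-4 through ChatOpenAI via ZhipuAI's
# OpenAI-compatible endpoint; the key value is a placeholder.
glm4_kwargs = {
    "model": "glm-4",
    "temperature": 0.95,
    "openai_api_key": "balabala",
    "openai_api_base": "https://open.bigmodel.cn/api/paas/v4/",
}

# from langchain_openai import ChatOpenAI
# llm = ChatOpenAI(**glm4_kwargs)
```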

yaohanze commented 2 months ago

Thank you for your suggestion. Your idea is to add the openai_api_base when defining the LLM to call GLM4 with ChatOpenAI. Is that correct?

boli-design commented 2 months ago

> Thank you for your suggestion. Your idea is to add the openai_api_base when defining the LLM to call GLM4 with ChatOpenAI. Is that correct?

right

yaohanze commented 2 months ago

I just found the instruction on ZhipuAI's website. Thank you.

yaohanze commented 2 months ago

> Thank you for your suggestion. Your idea is to add the openai_api_base when defining the LLM to call GLM4 with ChatOpenAI. Is that correct?
>
> right

I find that ChatOpenAI only supports GLM-4's temperature down to 0.01; any temperature lower than that causes an error. I have no idea why this is the case.
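Until the cause is clear, a client-side clamp avoids the failure when temperatures come from user input. This is a hypothetical helper, and the 0.01 lower bound is simply the empirical limit reported above:

```python
def clamp_glm_temperature(
    temperature: float,
    minimum: float = 0.01,
    maximum: float = 0.99,
) -> float:
    """Clamp a temperature into the range observed to work for GLM-4
    when called through ChatOpenAI; bounds are empirical, not documented."""
    return max(minimum, min(maximum, temperature))
```

For example, `clamp_glm_temperature(0.0)` yields `0.01`, which the endpoint accepts.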