langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

OllamaFunctions does not work - Received unsupported message type for Ollama #20924

Open baswenneker opened 2 months ago

baswenneker commented 2 months ago

Example Code

from langchain_community.tools import DuckDuckGoSearchResults
from langchain import hub
from langchain.agents import create_openai_functions_agent
from langchain_experimental.llms.ollama_functions import OllamaFunctions
from langgraph.prebuilt import create_agent_executor

tools = [DuckDuckGoSearchResults(max_results=3)]

# llm = OllamaFunctions(model="mixtral")
llm = OllamaFunctions(model="llama3:latest")
prompt = hub.pull("hwchase17/openai-functions-agent")

# Construct the OpenAI Functions agent
agent_runnable = create_openai_functions_agent(llm, tools, prompt)

agent_executor = create_agent_executor(agent_runnable, tools)
agent_executor.invoke(
    {"input": "who is the winnner of the us open", "chat_history": []}
)

Error Message and Stack Trace (if applicable)

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[1], line 21
     18 agent_runnable = create_openai_functions_agent(llm, tools, prompt)
     20 agent_executor = create_agent_executor(agent_runnable, tools)
---> 21 agent_executor.invoke(
     22     {"input": "who is the winner of the us open", "chat_history": []}
     23 )

...

    152 def _create_chat_stream(
    153     self,
    154     messages: List[BaseMessage],
    155     stop: Optional[List[str]] = None,
    156     **kwargs: Any,
    157 ) -> Iterator[str]:
    158     payload = {
    159         "model": self.model,
--> 160         "messages": self._convert_messages_to_ollama_messages(messages),
    161     }
    162     yield from self._create_stream(
    163         payload=payload, stop=stop, api_url=f"{self.base_url}/api/chat", **kwargs
    164     )

File ~/Development/HeadingFWD/langchain-playground/.venv/lib/python3.11/site-packages/langchain_community/chat_models/ollama.py:112, in ChatOllama._convert_messages_to_ollama_messages(self, messages)
    110     role = "system"
    111 else:
--> 112     raise ValueError("Received unsupported message type for Ollama.")
    114 content = ""
    115 images = []

ValueError: Received unsupported message type for Ollama.

Description

I'm trying to get function calling working with OllamaFunctions. I tried this with several different models (mistral, llama3, dolphincoder, mixtral:8x22b), and it always responds with:

ValueError: Received unsupported message type for Ollama.

I've also found this issue and PR, which might be related:
https://github.com/langchain-ai/langchain/issues/14360
https://github.com/langchain-ai/langchain/pull/20881

I found that OllamaFunctions returns a FunctionMessage, but _convert_messages_to_ollama_messages in ChatOllama doesn't recognize that message type, so the function call can never be completed.

https://github.com/langchain-ai/langchain/blob/4c437ebb9c2fb532ce655ac1e0c354c82a715df7/libs/community/langchain_community/chat_models/ollama.py#L99
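For context, here is a minimal, self-contained paraphrase of that dispatch (the _role_for helper is hypothetical, standing in for the real _convert_messages_to_ollama_messages); it shows why a FunctionMessage falls through to the error:

from langchain_core.messages import (
    AIMessage,
    BaseMessage,
    FunctionMessage,
    HumanMessage,
    SystemMessage,
)

def _role_for(message: BaseMessage) -> str:
    # Paraphrase of the dispatch inside _convert_messages_to_ollama_messages:
    # only these three message types are mapped to Ollama chat roles.
    if isinstance(message, HumanMessage):
        return "user"
    if isinstance(message, AIMessage):
        return "assistant"
    if isinstance(message, SystemMessage):
        return "system"
    # The FunctionMessage produced by the agent loop falls through to here.
    raise ValueError("Received unsupported message type for Ollama.")

_role_for(FunctionMessage(name="search", content="..."))  # raises ValueError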

Any help would be greatly appreciated.

System Info

python3 -m langchain_core.sys_info

System Information
------------------
> OS:  Darwin
> OS Version:  Darwin Kernel Version 23.4.0: Fri Mar 15 00:12:37 PDT 2024; root:xnu-10063.101.17~1/RELEASE_ARM64_T6031
> Python Version:  3.11.8 (v3.11.8:db85d51d3e, Feb  6 2024, 18:02:37) [Clang 13.0.0 (clang-1300.0.29.30)]

Package Information
-------------------
> langchain_core: 0.1.46
> langchain: 0.1.16
> langchain_community: 0.0.34
> langsmith: 0.1.49
> langchain_experimental: 0.0.57
> langchain_openai: 0.1.3
> langchain_text_splitters: 0.0.1
> langchainhub: 0.1.15
> langgraph: 0.0.39

Packages not installed (Not Necessarily a Problem)
--------------------------------------------------
The following packages were not found:

> langserve

solarslurpi commented 2 months ago

Thank you! I appreciate your comment on the Ollama GitHub.

baswenneker commented 2 months ago

I haven't tried it yet, but this could be a workaround for people who need something right now (see the sketch below): https://python.langchain.com/docs/use_cases/tool_use/prompting/
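A minimal sketch of that prompt-based approach, assuming a running Ollama server; the tool description and JSON format here are illustrative, loosely following the linked guide rather than copied from it:

from langchain_community.chat_models import ChatOllama
from langchain_core.output_parsers import JsonOutputParser
from langchain_core.prompts import ChatPromptTemplate

# Describe the tools in the prompt and ask for JSON, instead of relying on
# OllamaFunctions' function-calling message types.
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You have access to one tool:\n"
            "search(query: str) -> str  # web search\n"
            'Respond ONLY with JSON of the form {{"name": ..., "arguments": ...}}.',
        ),
        ("human", "{input}"),
    ]
)

# Plain ChatOllama avoids the message conversion that raises the ValueError.
chain = prompt | ChatOllama(model="llama3") | JsonOutputParser()
print(chain.invoke({"input": "who is the winner of the us open"}))

You still have to execute the chosen tool yourself and feed the result back into the prompt, but no FunctionMessage ever reaches ChatOllama.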

baswenneker commented 2 months ago

I'm fairly confident this issue is fixed by this PR: https://github.com/langchain-ai/langchain/pull/20881

It has been merged but not yet released; a release should follow in a few days.

lalanikarim commented 2 weeks ago

Take a look at #22339, which should address this issue. The PR was approved and merged yesterday, but a release has yet to be cut from it; that should happen in the next few days.

In the meantime, you can try installing langchain-experimental directly from langchain's source like this:

pip install "git+https://github.com/langchain-ai/langchain.git#egg=langchain-experimental&subdirectory=libs/experimental"
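
After installing, a quick sanity check (this uses the standard importlib.metadata API, nothing langchain-specific) confirms which version you ended up with:

from importlib.metadata import version

# Should report the build just installed from source, not the old release.
print(version("langchain-experimental"))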

I hope this helps.