baswenneker opened 2 months ago
Thank you! I appreciate your comment on the ollama github.
I haven't tried yet, but this could be a workaround for people who are in desperate need: https://python.langchain.com/docs/use_cases/tool_use/prompting/
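For anyone who wants to try that route before the fix lands: the prompt-based approach on that page boils down to instructing the model to emit a JSON tool call and parsing/dispatching it yourself. A minimal self-contained sketch (the tool registry, prompt, and `dispatch` helper below are my own illustration, not LangChain's API):

```python
import json

# Illustrative tool registry -- names and schema are made up for this sketch.
TOOLS = {
    "multiply": lambda a, b: a * b,
}

SYSTEM_PROMPT = (
    "You have access to one tool: multiply(a, b). "
    'Reply ONLY with JSON like {"tool": "multiply", "args": {"a": 2, "b": 3}}.'
)

def dispatch(model_output: str):
    """Parse the model's JSON reply and invoke the matching tool."""
    call = json.loads(model_output)
    return TOOLS[call["tool"]](**call["args"])

# A reply like a local model might produce when prompted with SYSTEM_PROMPT:
print(dispatch('{"tool": "multiply", "args": {"a": 6, "b": 7}}'))  # prints 42
```

You'd send `SYSTEM_PROMPT` plus the user question to the model via `ChatOllama` as usual, then feed its raw text reply to `dispatch`. Fragile compared to native function calling, but it works with any model.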
I'm fairly confident this is fixed by this PR: https://github.com/langchain-ai/langchain/pull/20881
It is currently merged but not yet released; a release should follow in a few days.
Take a look at #22339, which should have addressed this issue. The PR was approved and merged yesterday, but a release has yet to be cut from it; that should happen in the next few days.
In the meantime, you may try to install langchain-experimental directly from LangChain's source like this:
pip install "git+https://github.com/langchain-ai/langchain.git#egg=langchain-experimental&subdirectory=libs/experimental"
I hope this helps.
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
Description
I'm trying to get function calls working with `OllamaFunctions`. I tried this with several different models btw (mistral, llama3, dolphincoder, mixtral:8x22b); it will always respond with:

I've found these issues as well that might be related:
https://github.com/langchain-ai/langchain/issues/14360
https://github.com/langchain-ai/langchain/pull/20881
I found that `OllamaFunctions` returns a `FunctionMessage`, but `_convert_messages_to_ollama_messages` from `ChatOllama` doesn't recognize it and so can't complete the function call:
https://github.com/langchain-ai/langchain/blob/4c437ebb9c2fb532ce655ac1e0c354c82a715df7/libs/community/langchain_community/chat_models/ollama.py#L99
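To illustrate the failure mode, here is a self-contained sketch with stand-in message classes (the real ones live in `langchain_core.messages`, and `convert_messages` is only my rough paraphrase of the conversion logic in the line linked above, not the actual implementation):

```python
from dataclasses import dataclass

# Stand-in message types for this sketch; the real classes are in
# langchain_core.messages.
@dataclass
class HumanMessage:
    content: str

@dataclass
class AIMessage:
    content: str

@dataclass
class FunctionMessage:
    content: str
    name: str = ""

def convert_messages(messages):
    """Rough shape of _convert_messages_to_ollama_messages: only human and
    AI messages are mapped to Ollama chat roles, so a FunctionMessage falls
    through to the error branch and the tool result never reaches the model."""
    out = []
    for m in messages:
        if isinstance(m, HumanMessage):
            out.append({"role": "user", "content": m.content})
        elif isinstance(m, AIMessage):
            out.append({"role": "assistant", "content": m.content})
        else:
            raise ValueError("Received unsupported message type for Ollama.")
    return out

try:
    convert_messages([HumanMessage("2 + 2?"), FunctionMessage("4", name="add")])
except ValueError as e:
    print(e)  # the FunctionMessage is rejected
```

So the function call itself is emitted fine; it's the round trip (sending the function's result back through the chat model) that blows up.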
Any help would be greatly appreciated.
System Info