davidmigloz / langchain_dart

Build LLM-powered Dart/Flutter applications.
https://langchaindart.dev
MIT License
426 stars 75 forks

feat: Add ToolsAgent for models with tool-calling support #530

Closed Heinrich-vanNieuwenhuizen closed 2 months ago

Heinrich-vanNieuwenhuizen commented 2 months ago

Description

This pull request introduces a new ToolsAgent that simplifies the integration of tools into agents powered by language models such as Ollama and OpenAI. The agent supports tool-driven workflows, enhancing the flexibility of LangChain agents.
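A minimal usage sketch of the new agent (assuming the `ChatOllama` wrapper from `langchain_ollama` and the package's bundled `CalculatorTool`; the names follow the package README and this PR, but exact signatures may differ):

```dart
import 'package:langchain/langchain.dart';
import 'package:langchain_ollama/langchain_ollama.dart';

Future<void> main() async {
  // A chat model with native tool-calling support (assumes a local
  // Ollama server is running and the model has been pulled).
  final llm = ChatOllama(
    defaultOptions: const ChatOllamaOptions(
      model: 'llama3-groq-tool-use',
      temperature: 0,
    ),
  );

  // The agent forwards the tool definitions to the provider's
  // tool-calling API instead of injecting a custom prompt.
  final agent = ToolsAgent.fromLLMAndTools(
    llm: llm,
    tools: [CalculatorTool()],
  );

  final executor = AgentExecutor(agent: agent);
  final res = await executor.run('What is 40 raised to the 0.43 power?');
  print(res);
}
```

Because the agent relies on the provider's native tool-calling support, the same code works with other chat wrappers such as `ChatOpenAI`.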

Dependencies

Testing

@davidmigloz & @Ganeshsivakumar

davidmigloz commented 2 months ago

@Heinrich-vanNieuwenhuizen Let me know if you have any comments on my changes, otherwise I'll merge it!

Heinrich-vanNieuwenhuizen commented 2 months ago
  • llama3.1, as it's a more powerful model than llama3-groq-tool-use for most use cases

Hi @davidmigloz, do your tests pass with llama3.1? Mine only work with llama3-groq-tool-use.

I had mixed results without "Removing the custom tool-calling prompt, as the providers already handle this."

davidmigloz commented 2 months ago

do your tests pass with llama3.1? mine only works with llama3-groq-tool-use

I left llama3-groq-tool-use for tools_test.dart. With llama3.1, all the tests pass except the "Test ToolsAgent with messages memory" test, which hits a known limitation of llama3.1:

We recommend using Llama 70B-instruct or Llama 405B-instruct for applications that combine conversation and tool calling. Llama 8B-Instruct can not reliably maintain a conversation alongside tool calling definitions. It can be used for zero-shot tool calling, but tool instructions should be removed for regular conversations between the model and the user.

I had mixed results without the custom tool-calling prompt

If we add that custom prompt, the actual prompt that reaches the model will contain two tool definitions: the one from our custom prompt plus the tool-calling prompt that Ollama (or another provider) adds. That would likely confuse the model and add many unnecessary input tokens. The custom prompt should only be necessary for models that don't support tool calling; in that case, the user can provide a custom system prompt:

final agent = ToolsAgent.fromLLMAndTools(
  llm: llm,
  systemChatMessage: <their custom prompt>,
);
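For a model without native tool calling, such a custom system prompt could be built with a prompt template; a hypothetical sketch (the wording and the `{tools}` placeholder are illustrative, not part of the package API):

```dart
import 'package:langchain/langchain.dart';

// Hypothetical custom system prompt for a model without native tool
// calling: it has to describe the available tools itself. Double braces
// escape literal braces in the template; {tools} is a template variable.
final systemPrompt = SystemChatMessagePromptTemplate.fromTemplate('''
You are a helpful assistant. You can call one of the following tools by
replying with a JSON object of the form {{"tool": name, "input": input}}:
{tools}
''');
```

This would then be passed as the `systemChatMessage` argument in place of the provider-managed tool-calling prompt.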
Heinrich-vanNieuwenhuizen commented 2 months ago

@Heinrich-vanNieuwenhuizen Let me know if you have any comments on my changes, otherwise I'll merge it!

@davidmigloz Okay, looks good, you can merge.

davidmigloz commented 2 months ago

Merged! Thanks again for your contribution @Heinrich-vanNieuwenhuizen 🙌 I'll release it this week.

davidmigloz commented 2 months ago

I've tested it with mistral-nemo, and it's also working pretty well.