patterns-ai-core / langchainrb

Build LLM-powered applications in Ruby
https://rubydoc.info/gems/langchainrb
MIT License

Add support for parallel_tool_calls option when configuring Langchain::Assistant #813

Status: Open. Opened by andreibondarev 1 month ago

andreibondarev commented 1 month ago

Is your feature request related to a problem? Please describe. We'd like to enable better control of tool calling when using Langchain::Assistant. Some of the supported LLMs (Anthropic and OpenAI) let you control whether parallel tool calls ("multiple tool calls") can be made. In some use cases the Assistant must call tools sequentially; hence, we should be able to toggle that option on the Assistant instance.

Describe the solution you'd like Similar to tool_choice, enable the developer to toggle:

assistant = Langchain::Assistant.new(parallel_tool_calls: true/false, ...)
assistant.parallel_tool_calls = true/false
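A minimal sketch of how this could work internally, with the flag stored on the Assistant and forwarded into the provider's chat parameters (the `AssistantSketch` class and `chat_params` method below are illustrative, not the gem's actual internals; OpenAI's Chat Completions API does accept a top-level `parallel_tool_calls` boolean, while Anthropic expresses the equivalent differently, via `disable_parallel_tool_use` inside `tool_choice`):

```ruby
# Hypothetical sketch: an Assistant-like object that stores the
# parallel_tool_calls flag and forwards it to the LLM request.
class AssistantSketch
  attr_accessor :parallel_tool_calls

  def initialize(parallel_tool_calls: true)
    @parallel_tool_calls = parallel_tool_calls
  end

  # Build provider-specific chat parameters. For OpenAI this maps to the
  # top-level parallel_tool_calls boolean on the chat request.
  def chat_params
    { parallel_tool_calls: @parallel_tool_calls }
  end
end

assistant = AssistantSketch.new(parallel_tool_calls: false)
assistant.chat_params # => { parallel_tool_calls: false }

# Toggle at runtime, mirroring the proposed setter:
assistant.parallel_tool_calls = true
assistant.chat_params # => { parallel_tool_calls: true }
```

Since the providers spell this option differently, the per-provider adapters would each translate the single boolean into their own request shape.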

Tasks

sergiobayona commented 1 week ago

It seems Google Gemini does support parallel function calling; see:

https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling#supported_models https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling#parallel-samples

andreibondarev commented 1 week ago

> It seems Google Gemini does support parallel function calling; see:
>
> https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling#supported_models https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling#parallel-samples

It does, but there's no way to configure whether functions can be called in parallel or not.