lfkellogg opened 4 months ago
I'm wondering if we should elevate this to a top-level generate/model config. @mbleigh @pavelgj
OpenAI, Anthropic, et al. support the same:
- OpenAI: https://platform.openai.com/docs/api-reference/chat/create#chat-create-tool_choice
- Anthropic: https://docs.anthropic.com/en/docs/tool-use#forcing-tool-use
- Vertex AI: https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling#tool-config
👍 Anything shared across 3+ providers feels like it ought to be a built-in. Let's add it to the standard model config and add implementations in our plugins.
**Is your feature request related to a problem? Please describe.**
The Gemini API supports setting a "mode" for function calling: https://ai.google.dev/gemini-api/docs/function-calling#function_calling_mode
For example, this allows the developer to choose whether the LLM output should contain only function calls, or whether the LLM should decide between returning text and returning a function call.
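For reference, the Gemini API's tool config looks roughly like this. This is a minimal TypeScript sketch of the shape described in the docs linked above (field names follow the Node SDK's camelCase convention; the REST API uses snake_case):

```typescript
// Sketch of the Gemini API's tool config shape, per the function-calling docs.
type FunctionCallingMode = "AUTO" | "ANY" | "NONE";

interface FunctionCallingConfig {
  mode: FunctionCallingMode;
  // Only honored when mode is "ANY": restricts which declared
  // functions the model is allowed to call.
  allowedFunctionNames?: string[];
}

interface ToolConfig {
  functionCallingConfig: FunctionCallingConfig;
}

// Example: force the model to respond with a function call,
// limited to a (hypothetical) getWeather function.
const forceToolCall: ToolConfig = {
  functionCallingConfig: {
    mode: "ANY",
    allowedFunctionNames: ["getWeather"],
  },
};

console.log(JSON.stringify(forceToolCall));
```

`AUTO` lets the model decide, `ANY` forces a function call, and `NONE` disables function calling for the request.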
The `googleai` plugin currently does not support `tool_config`. This should be added to the custom model config.

**Describe the solution you'd like**
The mode could be specified during the `generate()` call.

**Describe alternatives you've considered**
N/A

**Additional context**
N/A
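To make the proposal concrete, here is one hypothetical shape for a shared, provider-neutral option on `generate()`, with the plugin mapping it onto Gemini's mode values. Every name here (`toolChoice`, `GenerateOptions`, `toGeminiMode`) is illustrative, not an existing Genkit API:

```typescript
// Hypothetical cross-provider tool-choice option; names are illustrative only.
type ToolChoice = "auto" | "required" | "none";

interface GenerateOptions {
  model: string;
  prompt: string;
  tools?: string[];
  toolChoice?: ToolChoice; // proposed top-level option
}

// Each plugin would translate the shared option into its provider's
// wire format; for Gemini that is function_calling_config.mode.
function toGeminiMode(choice: ToolChoice): "AUTO" | "ANY" | "NONE" {
  switch (choice) {
    case "auto":
      return "AUTO";
    case "required":
      return "ANY";
    case "none":
      return "NONE";
  }
}

const opts: GenerateOptions = {
  model: "googleai/gemini-1.5-pro",
  prompt: "What's the weather in Boston?",
  tools: ["getWeather"],
  toolChoice: "required",
};

console.log(toGeminiMode(opts.toolChoice!)); // "ANY"
```

OpenAI's `tool_choice` and Anthropic's `tool_choice` could be mapped by the same translation step in their respective plugins.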