langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com

Feature request: Vertex AI Function Calling #15073

Closed: lawwu closed this issue 8 months ago

lawwu commented 11 months ago

Feature request

Google's gemini-pro supports function calling. It would be nice if LangChain supported function calling through the VertexAI class, similar to what it already offers for OpenAI's function calling: https://cloud.google.com/vertex-ai/docs/generative-ai/multimodal/function-calling
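For reference, this is roughly what that standardized interface already looks like for OpenAI in LangChain. A minimal sketch, assuming the langchain_openai package and the legacy functions parameter (these package names reflect recent releases, not this issue's v0.0.352):

from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_core.utils.function_calling import convert_to_openai_function
from langchain_openai import ChatOpenAI

class GetWeather(BaseModel):
    """Get the current weather in a given location."""
    location: str = Field(..., description="City name, e.g. Boston")

# Bind an OpenAI-format function schema to the chat model.
llm = ChatOpenAI(model="gpt-3.5-turbo").bind(
    functions=[convert_to_openai_function(GetWeather)]
)
msg = llm.invoke("What's the weather in Boston?")
# The function call comes back in a standardized message field.
print(msg.additional_kwargs.get("function_call"))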

Motivation

Here's a notebook showing that, to access this functionality, you currently have to use the vertexai library directly, which means we lose LangChain's standardized input and output schemas: https://github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/function-calling/intro_function_calling.ipynb
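For comparison, the direct usage in that notebook looks roughly like this; a sketch following the notebook's preview API, so exact module paths may differ across vertexai SDK versions:

from vertexai.preview.generative_models import (
    FunctionDeclaration,
    GenerativeModel,
    Tool,
)

# Declare the function schema the model is allowed to call.
get_weather = FunctionDeclaration(
    name="get_current_weather",
    description="Get the current weather in a given location",
    parameters={
        "type": "object",
        "properties": {"location": {"type": "string"}},
    },
)
weather_tool = Tool(function_declarations=[get_weather])

model = GenerativeModel("gemini-pro")
response = model.generate_content(
    "What is the weather like in Boston?",
    tools=[weather_tool],
)
# The function call comes back as a raw SDK object, not a LangChain
# message, which is the standardization gap described above.
print(response.candidates[0].content.parts[0].function_call)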

Your contribution

I may be able to help.

dosubot[bot] commented 11 months ago

🤖

Hello @lawwu! I'm Dosu, a friendly bot here to assist you while we wait for a human maintainer. I'm here to help with troubleshooting, answering your queries, and guiding your journey to becoming a contributor. Nice to meet you!

Thank you for your feature request. As of the current version of LangChain (v0.0.352), there is no Gemini function-calling support similar to OpenAI's. The framework does integrate Google's gemini-pro model through the google.generativeai library, but it does not mention or use the vertexai library.

The GoogleGenerativeAI class in LangChain does have a model field which can be set to "gemini-pro", and the _generate and _stream methods have special handling for Gemini models. However, these methods do not currently support function calling.

Here is the relevant code:

# Imports are elided in the original excerpt; these are the ones the
# snippet needs (module: langchain_google_genai):
from typing import Any, Iterator, List, Optional

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models import BaseLLM
from langchain_core.outputs import GenerationChunk, LLMResult
from langchain_core.pydantic_v1 import BaseModel, Field

class GoogleGenerativeAI(BaseLLM, BaseModel):
    """Google GenerativeAI models.

    Example:
        .. code-block:: python

            from langchain_google_genai import GoogleGenerativeAI
            llm = GoogleGenerativeAI(model="gemini-pro")
    """

    model: str = Field(
        ...,
        description="""The name of the model to use.
Supported examples:
    - gemini-pro
    - models/text-bison-001""",
    )

    @property
    def is_gemini(self) -> bool:
        """Return whether the model belongs to the Gemini family."""
        # _is_gemini_model is a module-level helper elided from this excerpt.
        return _is_gemini_model(self.model)

    def _generate(
        self,
        prompts: List[str],
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> LLMResult:
        # Abridged: the full method also folds sampling parameters
        # (temperature, top_p, top_k, ...) into generation_config.
        generation_config = {"stop_sequences": stop}
        for prompt in prompts:
            if self.is_gemini:
                # _completion_with_retry is a module-level retry wrapper
                # around the underlying SDK call.
                res = _completion_with_retry(
                    self,
                    prompt=prompt,
                    stream=False,
                    is_gemini=True,
                    run_manager=run_manager,
                    generation_config=generation_config,
                )
        # ... truncated: responses are collected and returned as an LLMResult

    def _stream(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> Iterator[GenerationChunk]:
        generation_config = {"stop_sequences": stop}  # abridged, as above
        for stream_resp in _completion_with_retry(
            self,
            prompt,
            stream=True,
            is_gemini=True,
            run_manager=run_manager,
            generation_config=generation_config,
            **kwargs,
        ):
            ...  # truncated: each chunk is yielded as a GenerationChunk
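
For contrast, the google.generativeai SDK that backs this class does expose function calling in newer releases, so the missing piece is mainly plumbing it through these methods. A minimal sketch, assuming a recent SDK version that accepts Python callables as tools (this is not current GoogleGenerativeAI behavior):

import google.generativeai as genai

def get_current_weather(location: str) -> str:
    """Get the current weather in a given location."""
    return "sunny"  # stub for illustration

# Recent SDK versions derive a function schema from the callable's
# signature and docstring.
model = genai.GenerativeModel("gemini-pro", tools=[get_current_weather])
response = model.generate_content("What is the weather in Boston?")
# The response part carries a function_call that a LangChain wrapper
# would need to map onto its standard output schema.
print(response.candidates[0].content.parts[0].function_call)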


Your feature request for function calling support when using the VertexAI class is a valuable suggestion. It would indeed be beneficial to have LangChain standardize the input and output schemas for this functionality. However, as a technical support rep, I'm unable to create pull requests or issues. I would recommend reaching out to the LangChain developers directly with your feature request.

Your offer to possibly help with this feature is greatly appreciated. Contributions from users like you are what help open-source projects like LangChain thrive and improve. If you have any further questions or need more information, please don't hesitate to ask.

Best regards, Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.