run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Feature Request]: Upgrade Gemini LLM to be function calling for select Gemini LLMs #14993

Open tituslhy opened 1 month ago

tituslhy commented 1 month ago

Feature Description

According to Google, some of their Gemini models support function calling, but in LlamaIndex is_function_calling_model is set to False for these models.

Gemini function calling docs: https://ai.google.dev/gemini-api/docs/function-calling

LlamaIndex code:

from llama_index.llms.gemini import Gemini

llm = Gemini(model="models/gemini-1.5-pro-latest")
llm.metadata

This prints:

LLMMetadata(context_window=2105344, num_output=8192, is_chat_model=True, is_function_calling_model=False, model_name='models/gemini-1.5-pro-latest', system_role=<MessageRole.SYSTEM: 'system'>)
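
For context, the Gemini docs linked above show that the raw SDK already handles function calling itself. The snippet below is only an illustrative, untested sketch of that API, assuming the google-generativeai package and a GOOGLE_API_KEY environment variable; get_current_weather is a made-up example tool:

import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

def get_current_weather(city: str) -> str:
    """Return a canned weather report for the given city."""
    return f"It is sunny in {city}."

# The SDK accepts plain Python functions as tools and builds the
# function-calling schema from their signatures and docstrings.
model = genai.GenerativeModel(
    "gemini-1.5-pro-latest",
    tools=[get_current_weather],
)
chat = model.start_chat(enable_automatic_function_calling=True)
response = chat.send_message("What's the weather in Singapore right now?")
print(response.text)

So the capability exists upstream; the request here is just to expose it through the Gemini LLM class in LlamaIndex.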

Reason

N/A

Value of Feature

Gemini is currently free of charge (but rate-limited), so having function calling support would be a great help for quick prototyping.

logan-markewich commented 1 month ago

I welcome a contribution for this! I looked into it once and, to be honest, found Gemini's API pretty confusing to implement in the way that llama-index requires.

Vertex AI in llama-index supports function calling, though, so you can access Gemini that way if you are using Vertex AI.
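
For anyone who needs this in the meantime, here is a rough, untested sketch of that Vertex AI route, assuming the llama-index-llms-vertex package is installed and GCP application-default credentials are configured; the multiply tool is just a placeholder:

from llama_index.core.tools import FunctionTool
from llama_index.llms.vertex import Vertex

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# Gemini served through Vertex AI; project and credentials are picked up
# from the standard GCP environment.
llm = Vertex(model="gemini-1.5-pro")
print(llm.metadata.is_function_calling_model)  # expected to report True

# Function-calling LLMs in llama-index can select and invoke a tool directly.
tool = FunctionTool.from_defaults(fn=multiply)
response = llm.predict_and_call([tool], "What is 6 times 7?")
print(response)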