run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License
37.08k stars 5.32k forks

[Bug]: Gemini not working with llama-index #15600

Closed Shiva4113 closed 3 months ago

Shiva4113 commented 3 months ago

Bug Description

I was attempting to use Gemini by following the examples in the official documentation.

I kept encountering the error shown in the logs below when I executed the following lines of code:

from llama_index.llms.gemini import Gemini
resp = Gemini().complete("Write a poem about a magic backpack")
print(resp)

"TypeError: Expected: str, Model, or TunedModel" is the error I encountered.

Version

0.11

Steps to Reproduce

You will need to create and use a Google API key for Gemini.

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
import os 
from dotenv import load_dotenv
from llama_index.embeddings.gemini import GeminiEmbedding
from llama_index.core import Settings
from llama_index.core.node_parser import TokenTextSplitter
from llama_index.llms.gemini import Gemini

load_dotenv()

GOOGLE_API_KEY = os.getenv('GEMINI_API_KEY')
os.environ["GOOGLE_API_KEY"] = GOOGLE_API_KEY

resp = Gemini().complete("Write a poem about a magic backpack")
print(resp)
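As a side note on the snippet above: if GEMINI_API_KEY is missing from the .env file, os.getenv returns None and the os.environ assignment itself raises a TypeError, which can be confused with the library error. A minimal guard (the load_api_key helper name here is hypothetical, not part of llama-index) rules that out:

```python
import os

def load_api_key(var_name: str = "GEMINI_API_KEY") -> str:
    """Read the key from the environment and fail loudly if it is absent."""
    key = os.getenv(var_name)
    if key is None:
        raise RuntimeError(f"{var_name} is not set; check your .env file")
    # google-generativeai picks up the key from GOOGLE_API_KEY
    os.environ["GOOGLE_API_KEY"] = key
    return key
```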

Relevant Logs/Tracebacks

{
    "name": "TypeError",
    "message": "Expected: str, Model, or TunedModel",
    "stack": "---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[9], line 1
----> 1 resp = Gemini().complete(\"Write a poem about a magic backpack\")
      2 print(resp)

File c:\\venv\\Lib\\site-packages\\llama_index\\llms\\gemini\\base.py:147, in Gemini.__init__(self, api_key, model, temperature, max_tokens, generation_config, safety_settings, callback_manager, api_base, transport, model_name, default_headers, **generate_kwargs)
    139 final_gen_config = {\"temperature\": temperature, **base_gen_config}
    141 model = genai.GenerativeModel(
    142     model_name=model,
    143     generation_config=final_gen_config,
    144     safety_settings=safety_settings,
    145 )
--> 147 model_meta = genai.get_model(model)
    149 supported_methods = self._model_meta.supported_generation_methods
    150 if \"generateContent\" not in supported_methods:

File c:\\venv\\Lib\\site-packages\\google\\generativeai\\models.py:52, in get_model(name, client, request_options)
     30 def get_model(
     31     name: model_types.AnyModelNameOptions,
     32     *,
     33     client=None,
     34     request_options: dict[str, Any] | None = None,
     35 ) -> model_types.Model | model_types.TunedModel:
     36     \"\"\"Given a model name, fetch the `types.Model` or `types.TunedModel` object.
     37 
     38     
   (...)
     50         A `types.Model` or `types.TunedModel` object.
     51     \"\"\"
---> 52     name = model_types.make_model_name(name)
     53     if name.startswith(\"models/\"):
     54         return get_base_model(name, client=client, request_options=request_options)

File c:\\venv\\Lib\\site-packages\\google\\generativeai\\types\\model_types.py:350, in make_model_name(name)
    348     name = name
    349 else:
--> 350     raise TypeError(\"Expected: str, Model, or TunedModel\")
    352 if not (name.startswith(\"models/\") or name.startswith(\"tunedModels/\")):
    353     raise ValueError(f\"Model names should start with `models/` or `tunedModels/`, got: {name}\")

TypeError: Expected: str, Model, or TunedModel"

}
dosubot[bot] commented 3 months ago

@Shiva4113 Hello! I'm here to help you with any issues, answer your questions, and guide you to become a contributor. Let's tackle this bug together!

I found a similar closed issue that might help you resolve the TypeError: "Expected: str, Model, or TunedModel" error. The issue is related to the model_name parameter you pass to the Gemini class. Ensure that the model_name parameter is a string representing a valid model name.

Here is an example of how to correctly instantiate and use the Gemini class:

from llama_index.llms.gemini import Gemini

# Ensure you pass a valid model name as a string
llm = Gemini(model="models/gemini-1.5-flash", api_key="YOUR_API_KEY")

# Use the complete method to generate text
resp = llm.complete("Write a poem about a magic backpack")
print(resp)

In this example, "models/gemini-1.5-flash" is a valid model name string. Make sure that the model_name parameter is not None and is one of the expected types. The valid model names are defined in the GEMINI_MM_MODELS tuple, such as "models/gemini-pro-vision", "models/gemini-ultra-vision", and "models/gemini-1.5-flash" [1][2][3][4].
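Going by the make_model_name frames in the traceback above, the library rejects anything that is not a str (or a Model/TunedModel object) and requires string names to start with "models/" or "tunedModels/". A quick client-side sketch of that string check (the helper name is hypothetical) lets you validate a candidate name before constructing the client:

```python
def looks_like_valid_model_name(name) -> bool:
    # Mirrors the string checks in
    # google.generativeai.types.model_types.make_model_name: a non-string
    # (other than a Model/TunedModel object, not handled in this sketch)
    # raises TypeError, and string names must carry one of two prefixes.
    if not isinstance(name, str):
        return False
    return name.startswith("models/") or name.startswith("tunedModels/")
```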

Additionally, you can refer to the Gemini model site to find the models that are currently available and their capabilities. You can also use the API to list suitable models:

import google.generativeai as genai

for m in genai.list_models():
    if "generateContent" in m.supported_generation_methods:
        print(m.name)

This will help you ensure that the model name you are using is valid and supported.
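The same filtering logic can be exercised offline against plain metadata records (the sample data below is made up for illustration), which avoids a live API call while you debug:

```python
def models_supporting(method: str, models: list) -> list:
    # Keep only the names of models whose metadata advertises the method,
    # mirroring the supported_generation_methods check in the snippet above.
    return [
        m["name"]
        for m in models
        if method in m.get("supported_generation_methods", [])
    ]

# Hypothetical metadata records shaped like genai.list_models() results
sample = [
    {"name": "models/gemini-1.5-flash",
     "supported_generation_methods": ["generateContent"]},
    {"name": "models/embedding-001",
     "supported_generation_methods": ["embedContent"]},
]
print(models_supporting("generateContent", sample))
```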

To continue talking to Dosu, mention @dosu.

Shiva4113 commented 3 months ago

Upon running this code:

from llama_index.llms.gemini import Gemini

# Ensure you pass a valid model name as a string
llm = Gemini(model="models/gemini-1.5-flash", api_key=GOOGLE_API_KEY)

# Use the complete method to generate text
resp = llm.complete("Write a poem about a magic backpack")
print(resp)

I still get the same error.

TypeError: Expected: str, Model, or TunedModel

dosubot[bot] commented 3 months ago

The error "TypeError: Expected: str, Model, or TunedModel" is likely occurring because the model parameter is not being passed correctly or is not of the expected type when initializing the Gemini class. Ensure that the model parameter is a valid string from the predefined GEMINI_MODELS.

Here is an example of how to correctly initialize the Gemini class:

from llama_index.llms.gemini import Gemini

# Correct initialization
gemini_model = Gemini(
    api_key=GOOGLE_API_KEY,
    model="models/gemini-1.5-flash",  # Ensure this is a valid string from GEMINI_MODELS
    temperature=0.7,
    max_tokens=100
)

# Use the complete method to generate text
resp = gemini_model.complete("Write a poem about a magic backpack")
print(resp)

Make sure that the model you are passing is one of the valid options defined in GEMINI_MODELS.

Ensure that the GOOGLE_API_KEY is correctly set and accessible in your environment.

To continue talking to Dosu, mention @dosu.