run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: Gemini Ultra Supported? #13008

Closed anniedoris closed 4 months ago

anniedoris commented 5 months ago

Bug Description

When I try to create:

gemini_pro = GeminiMultiModal(model_name="models/gemini-ultra-vision")

I get the following error:

google.api_core.exceptions.NotFound: 404 Model is not found: models/gemini-ultra-vision for api version v1beta

Do you support Gemini Ultra?

Version

0.10.30

Steps to Reproduce

from llama_index.multi_modal_llms.gemini import GeminiMultiModal

# Raises google.api_core.exceptions.NotFound (404) on 0.10.30:
gemini_pro = GeminiMultiModal(model_name="models/gemini-ultra-vision")

Relevant Logs/Tracebacks

No response

dosubot[bot] commented 5 months ago

It looks like there was a slight mistake in the model name you used. The correct model name for Gemini Ultra should be "models/gemini-ultra", not "models/gemini-ultra-vision". Here's how you should initialize the Gemini model:

gemini_pro = GeminiMultiModal(model_name="models/gemini-ultra")

This adjustment should resolve the NotFound error you encountered. For further details, you might want to check the documentation or the source code of the LlamaIndex repository to ensure you're using the latest supported models and features.

#### Details

**Sources**

- [llama-index-integrations/multi_modal_llms/llama-index-multi-modal-llms-gemini/llama_index/multi_modal_llms/gemini/base.py](https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/multi_modal_llms/llama-index-multi-modal-llms-gemini/llama_index/multi_modal_llms/gemini/base.py)
- [llama-index-integrations/llms/llama-index-llms-gemini/llama_index/llms/gemini/base.py](https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/llms/llama-index-llms-gemini/llama_index/llms/gemini/base.py)
- [llama-index-legacy/llama_index/legacy/multi_modal_llms/gemini.py](https://github.com/run-llama/llama_index/blob/main/llama-index-legacy/llama_index/legacy/multi_modal_llms/gemini.py)
**About Dosu** This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research.

logan-markewich commented 5 months ago

We support it, but it depends on your access and installed client version (that error response is coming directly from Google 🤷🏻 )

logan-markewich commented 5 months ago

There's an example here (if you want to leverage the multimodal capabilities, you need the multi-modal LLM class for now):

https://docs.llamaindex.ai/en/stable/examples/multi_modal/gemini/?h=gemini

You can also use Google's API to list the models available to you:

https://ai.google.dev/api/python/google/generativeai/list_models
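For illustration, the listing can be filtered down to models your key can actually use for generation. This is a hedged sketch: the `ModelInfo` stand-in mirrors the attributes I believe the records from `list_models` expose (`name`, `supported_generation_methods`); the live call is left commented out since it needs a key.

```python
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class ModelInfo:
    """Local stand-in for the records returned by list_models."""
    name: str
    supported_generation_methods: List[str]


def generation_models(models: Iterable) -> List[str]:
    """Return names of models that advertise generateContent --
    i.e. the ones you could pass to GeminiMultiModal."""
    return [
        m.name
        for m in models
        if "generateContent" in m.supported_generation_methods
    ]


if __name__ == "__main__":
    # Live check (requires the google-generativeai package and a
    # GOOGLE_API_KEY environment variable):
    #   import google.generativeai as genai
    #   print(generation_models(genai.list_models()))
    pass
```

If `models/gemini-ultra-vision` doesn't appear in that list for your key, the 404 above is expected regardless of what LlamaIndex supports.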

anniedoris commented 5 months ago

Ah ok thanks, I guess this means I need to be paying for Gemini Advanced to access Ultra?

logan-markewich commented 5 months ago

Maybe? Haha, I'm not fully aware of how Google's API and access work (just pointing to examples I was aware of 😅)

anniedoris commented 5 months ago

Would it be possible to get support for the Gemini model gemini-1.5-pro-latest? I believe this is a different model.

logan-markewich commented 5 months ago

I think once again, it's supported assuming you can access the model over the Google client

anniedoris commented 5 months ago

The reason I'm a bit confused is that Gemini's website lists their current models. I see that models/gemini-pro-vision is one of them, and LlamaIndex supports it. However, the other model LlamaIndex supports, models/gemini-ultra-vision, is not on Gemini's current list of models. Is there a way I can access gemini-1.5-pro-latest through LlamaIndex? I don't think it's the same as Ultra?
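Since which models show up depends on the API key, one way to cope is a small fallback helper that picks the first model your key can access. This is a hypothetical sketch, not LlamaIndex API; the preference list is illustrative.

```python
from typing import Iterable, List

# Illustrative preference order; adjust to the models you care about.
PREFERRED = [
    "models/gemini-1.5-pro-latest",
    "models/gemini-ultra-vision",
    "models/gemini-pro-vision",
]


def pick_model(available: Iterable[str], preferred: List[str] = PREFERRED) -> str:
    """Return the first preferred model present in `available`."""
    available = set(available)
    for name in preferred:
        if name in available:
            return name
    raise ValueError(f"none of {preferred} is available to this API key")


if __name__ == "__main__":
    # `available` would come from google.generativeai's list_models();
    # the chosen name then goes to GeminiMultiModal(model_name=...).
    pass
```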

seldo commented 4 months ago

1.5-pro-latest and ultra support landed if you have access, so I think we can close this?