BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

Gemini-PRO not working as Provider #1816

Closed: Ashis-Palai closed this issue 5 months ago

Ashis-Palai commented 5 months ago

!pip install "litellm>=1.11.1"

from trulens_eval import LiteLLM, Feedback

provider = LiteLLM(model_engine="gemini-pro")

from trulens_eval import TruLlama
from trulens_eval import FeedbackMode

tru_recorder = TruLlama(
    sentence_window_engine,
    app_id="App_1",
    feedbacks=[f_qa_relevance, f_qs_relevance, f_groundedness],
)

for question in eval_questions:
    with tru_recorder as recording:
        sentence_window_engine.query(question)
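The feedback functions referenced above are defined earlier in the notebook, along these lines (a minimal sketch of one of them, following the TruLens quickstart pattern; the name "Answer Relevance" is illustrative):

from trulens_eval import Feedback

# Sketch: one of the feedback functions above, built on the LiteLLM provider
f_qa_relevance = Feedback(provider.relevance, name="Answer Relevance").on_input_output()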

Getting the error below:

ERROR:trulens_eval.feedback.provider.endpoint.base:litellm request failed <class 'litellm.exceptions.BadRequestError'>=VertexAIException - Unable to find your project. Please provide a project ID by:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.
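As the message suggests, verbose logging can be enabled before re-running to inspect the raw request (a minimal sketch):

import litellm

# Print the raw request/response while reproducing the failure
litellm.set_verbose = True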

krrishdholakia commented 5 months ago

@Ashis-Palai Are you trying to call Gemini Pro via the Google AI Studio API?

If so, change the model name to

gemini/gemini-pro

Docs - https://litellm.vercel.app/docs/providers/gemini
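For example, a minimal standalone call through the Google AI Studio API would look like this (a sketch; the key value is a placeholder, and GEMINI_API_KEY is the environment variable the Gemini provider reads):

import os
from litellm import completion

os.environ["GEMINI_API_KEY"] = "your-ai-studio-key"  # placeholder

response = completion(
    model="gemini/gemini-pro",  # the "gemini/" prefix routes to Google AI Studio
    messages=[{"role": "user", "content": "Hello, Gemini!"}],
)
print(response.choices[0].message.content)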

Ashis-Palai commented 5 months ago

@krrishdholakia Thank you so much for the workaround.

Later on, I found the issue was with my service account (it has to be active, since the project ID and location are passed along with the request). I also learned how to use Vertex AI Google models from the notebook below:

https://github.com/truera/trulens/blob/main/trulens_eval/examples/expositional/models/google_vertex_quickstart.ipynb
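For anyone hitting the same VertexAIException, a minimal sketch of the Vertex AI path (the credentials path, project ID, and location are placeholders; the service account must be active):

import os
import litellm
from litellm import completion

# Credentials for an *active* service account with Vertex AI access (placeholder path)
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account.json"
litellm.vertex_project = "my-gcp-project"  # project ID sent with the request
litellm.vertex_location = "us-central1"    # location sent with the request

response = completion(
    model="gemini-pro",  # no "gemini/" prefix, so this routes to Vertex AI
    messages=[{"role": "user", "content": "Hello from Vertex AI"}],
)
print(response.choices[0].message.content)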

Thanks to @joshreini1, we are good to close this issue.