Closed: @Ashis-Palai closed this issue 5 months ago.
@Ashis-Palai Are you trying to call Gemini Pro via the Google AI Studio API?
If so, change the model name to `gemini/gemini-pro`.
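A minimal sketch of the suggested fix, assuming LiteLLM's provider-prefix routing: the `gemini/` prefix sends the request to the Google AI Studio API, while a bare `gemini-pro` is treated as a Vertex AI model. The `GEMINI_API_KEY` variable name and the guard around the call are assumptions based on LiteLLM's conventions, not on this thread.

```python
import os

# The "gemini/" prefix selects the Google AI Studio route in LiteLLM.
model = "gemini/gemini-pro"

# Assumption: GEMINI_API_KEY holds a Google AI Studio key and litellm is installed.
if os.environ.get("GEMINI_API_KEY"):
    import litellm

    response = litellm.completion(
        model=model,
        messages=[{"role": "user", "content": "Say hello."}],
    )
    print(response.choices[0].message.content)
```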
@krrishdholakia Thank you so much for the workaround.
Later on, I found the issue with my service account (we need to make sure it's active, since the project ID and location have to be passed along with the request), and I also learned how to call Google's Vertex AI models, as shown below.
Thanks to @joshreini1; we are good to close this issue.
```python
!pip install "litellm>=1.11.1"

from trulens_eval import LiteLLM, Feedback

provider = LiteLLM(model_engine="gemini-pro")

from trulens_eval import TruLlama
from trulens_eval import FeedbackMode

tru_recorder = TruLlama(
    sentence_window_engine,
    app_id="App_1",
    feedbacks=[f_qa_relevance, f_qs_relevance, f_groundedness],
)

for question in eval_questions:
    with tru_recorder as recording:
        sentence_window_engine.query(question)
```
Getting the below error:
```
ERROR:trulens_eval.feedback.provider.endpoint.base:litellm request failed <class 'litellm.exceptions.BadRequestError'>=VertexAIException - Unable to find your project. Please provide a project ID by:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
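For anyone hitting the same "Unable to find your project" error: a hedged sketch of the service-account fix described above, based on LiteLLM's Vertex AI settings rather than this thread. The credentials path, project ID, and region below are placeholder assumptions you would replace with your own.

```python
import os

# Assumption: the service-account JSON key path goes in
# GOOGLE_APPLICATION_CREDENTIALS (replace with your actual path).
os.environ.setdefault("GOOGLE_APPLICATION_CREDENTIALS", "/path/to/service-account.json")

# Only attempt the call if the key file is actually present.
if os.path.exists(os.environ["GOOGLE_APPLICATION_CREDENTIALS"]):
    import litellm

    # Set the project ID and location before any Vertex AI call
    # (placeholder values; use your own project and region).
    litellm.vertex_project = "my-gcp-project"
    litellm.vertex_location = "us-central1"

    response = litellm.completion(
        model="gemini-pro",  # no "gemini/" prefix, so this takes the Vertex AI route
        messages=[{"role": "user", "content": "Say hello."}],
    )
    print(response.choices[0].message.content)
```

The service account also has to be active and have Vertex AI permissions on the project, as noted above.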