ANPCI opened this issue 1 month ago (status: Open)
Hi,
Can you try this?

from crewai import Agent, LLM, Task, Crew, Process

llm = LLM(
    model="gemini/gemini-pro",
    temperature=0.9,
    api_key=API_KEY,
)
For reference, you can also check this screenshot.
And the rest of your code:
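Since the screenshot itself isn't visible here, below is a rough sketch of how the rest could be wired up, assuming a single agent and task; the role, goal, backstory, and task text are placeholders rather than your original code:

researcher = Agent(
    role="Researcher",                       # placeholder role
    goal="Answer the user's question",       # placeholder goal
    backstory="A helpful research assistant.",
    llm=llm,                                 # the Gemini LLM configured above
    verbose=True,
)

research_task = Task(
    description="Summarize the latest news about AI agents.",  # placeholder task
    expected_output="A short summary paragraph.",
    agent=researcher,
)

crew = Crew(
    agents=[researcher],
    tasks=[research_task],
    process=Process.sequential,
    verbose=True,
)

result = crew.kickoff()
print(result)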
I also cannot use Gemini with Vertex AI in a crew, but it works when calling LiteLLM directly.
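For comparison, the direct LiteLLM call that works looks roughly like this; the project id and region below are placeholders, and it assumes application-default credentials for Vertex AI:

import litellm

# Gemini on Vertex AI via LiteLLM directly (project/location are placeholder values).
response = litellm.completion(
    model="vertex_ai/gemini-pro",
    messages=[{"role": "user", "content": "Say hello"}],
    vertex_project="my-gcp-project",
    vertex_location="us-central1",
)
print(response.choices[0].message.content)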
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
Any updates?
Description
CrewAI raises an error when using the Gemini Pro API, while it works fine with OpenAI models.
Steps to Reproduce
Add the script below to test.py and run it with poetry run python test.py.
Expected behavior
It should call the Gemini API and generate content.
Screenshots/Code snippets
import os
from langchain_google_genai import ChatGoogleGenerativeAI
from crewai import Agent, Task, Crew, Process

if __name__ == "__main__":
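    # (The body of the script was only in the screenshot and didn't carry over;
    #  what follows is a hypothetical reconstruction, not the reporter's original
    #  code. The model name, agent role, goal, and task text are placeholder
    #  assumptions.)
    gemini_llm = ChatGoogleGenerativeAI(
        model="gemini-pro",
        google_api_key=os.environ["GOOGLE_API_KEY"],  # assumes the key is set in the environment
    )

    writer = Agent(
        role="Writer",                                    # placeholder role
        goal="Write a short paragraph about AI agents",   # placeholder goal
        backstory="A concise technical writer.",
        llm=gemini_llm,
        verbose=True,
    )

    task = Task(
        description="Write one paragraph about AI agents.",  # placeholder task
        expected_output="A single paragraph.",
        agent=writer,
    )

    crew = Crew(agents=[writer], tasks=[task], process=Process.sequential)
    print(crew.kickoff())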
Operating System
macOS Sonoma
Python Version
3.10
crewAI Version
0.63.6
crewAI Tools Version
0.12.1
Virtual Environment
Poetry
Evidence
Error message in terminal:
Provider List: https://docs.litellm.ai/docs/providers
2024-10-03 16:03:06,390 - 8480485952 - llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable
Possible Solution
None
Additional context
Tried running the script multiple times; the same error occurs every time.