NPriyankaDS opened this issue 6 days ago
Hi @NPriyankaDS, could you check if this works on the main branch (self-hosted version)? The Langflow team is updating the agents' structure to ensure broader compatibility with different frameworks. We're aiming to maintain support for LangChain Embedding models while converting them into the corresponding CrewAI LLM() object.
This setup should work with OpenAI in the flow you're demonstrating. If you prefer a workaround for the current release, you can modify the Sequential Crew component's code to set your OpenAI API Key as an environment variable.
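A minimal sketch of that workaround, assuming you edit the Sequential Crew component's Python code from the Langflow UI; the exact placement (top of the component's build method) and the placeholder key are assumptions, but OPENAI_API_KEY is the environment variable litellm's OpenAI client reads by default.

```python
import os

# Workaround sketch: export the key before the Crew/LLM objects are built,
# so litellm's OpenAI client can find it. Replace the placeholder with your
# actual key, or read it from a Langflow global variable instead of
# hard-coding it in the component.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, not a real key
```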
Bug Description
I am using DataStax Langflow to build a multi-agent system flow with CrewAI. The model I am using is one of the ChatGroq models. When I use the model on its own, the flow works, but when I connect it to the Sequential Crew I get the error below:
raise OpenAIError( litellm.llms.OpenAI.openai.OpenAIError: Error code: 401 - {'error': {'message': 'Invalid API Key', 'type': 'invalid_request_error', 'code': 'invalid_api_key'}}
I had previously run the same flow locally before running it on DataStax, and it worked fine with the Groq models and CrewAI. I was using the Groq API key, not an OpenAI API key.
Reproduction
Use the Sequential Task and Sequential Crew components from the CrewAI module, with Groq as the language model and a Groq API key.
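For reference, a minimal standalone CrewAI sketch of the equivalent setup (a sequential crew driven by a Groq model through litellm's "groq/" provider prefix). The model name, agent/task wording, and key handling are illustrative assumptions, not the exact code Langflow generates; if the llm argument is not passed through to the agents, CrewAI falls back to OpenAI and raises the 401 shown above.

```python
import os
from crewai import Agent, Task, Crew, Process, LLM

# Assumption: litellm routes "groq/..." model names to Groq, using either
# an explicit api_key argument or the GROQ_API_KEY environment variable.
llm = LLM(
    model="groq/llama-3.1-70b-versatile",  # illustrative model name
    api_key=os.environ["GROQ_API_KEY"],
)

researcher = Agent(
    role="Researcher",
    goal="Answer the user's question",
    backstory="A focused research assistant.",
    llm=llm,  # without this, CrewAI defaults to OpenAI and the 401 appears
)

task = Task(
    description="Summarize the topic.",
    expected_output="A short summary.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[task], process=Process.sequential)
print(crew.kickoff())
```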
Expected behavior
The flow should run and produce output without any errors.
Who can help?
No response
Operating System
Windows 11
Langflow Version
Not sure
Python Version
None
Screenshot
No response
Flow File
No response