Closed ppk5 closed 2 months ago
I followed https://joaomdmoura.github.io/crewAI/how-to/LLM-Connections/#azure-open-ai and got an llm object that responds to a HumanMessage, but when I use it with agents I get this error:
/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 242, in stream
assert generation is not None
AssertionError
The snapshot below shows that the llm object responds to Human messages.
LangChain Core has a code branch in chat_models.py that causes the failure when the AzureChatOpenAI object does not support streaming.
Reported the issue upstream https://github.com/langchain-ai/langchain/issues/16930
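To make the failure mode concrete, here is a toy illustration (not LangChain's actual code) of the pattern in langchain_core's chat_models.py: `stream()` accumulates chunks into a `generation` variable that starts as `None`, so if the underlying model yields no chunks at all (as happens when the Azure deployment cannot stream), the final assert trips with exactly the error above.

```python
# Toy sketch of the failing pattern, NOT the real LangChain implementation.
def stream(chunks):
    generation = None
    for chunk in chunks:
        # Each streamed chunk is merged into the running generation.
        generation = chunk if generation is None else generation + chunk
        yield chunk
    # If the model produced no chunks, `generation` is still None here.
    assert generation is not None

# A model that streams normally is fine:
print(list(stream(["Hello", " world"])))

# A backend that yields nothing (no streaming support) hits the assert:
try:
    list(stream([]))
except AssertionError:
    print("AssertionError: assert generation is not None")
```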
I am also dealing with the Azure Chat OpenAI issue for a company project. Let me see if I can help. Please simplify your code to the bare minimum and make sure that calling OpenAI directly works properly with the entire CrewAI crew.
Hi, according to https://api.python.langchain.com/en/latest/agents/langchain.agents.agent.RunnableAgent.html, RunnableAgent has a stream_runnable option. It is possible to avoid streaming by modifying crewai/agent.py around line 280 from
self.agent_executor = CrewAgentExecutor(
    agent=RunnableAgent(runnable=inner_agent), **executor_args
)
to
self.agent_executor = CrewAgentExecutor(
    agent=RunnableAgent(runnable=inner_agent, stream_runnable=False), **executor_args
)
This allows CrewAI to run with a custom LLM without the "assert generation is not None" error. Tested with https://github.com/scenaristeur/igora
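The reason this flag helps can be sketched with a toy contrast (again not the real libraries): with streaming enabled the executor consumes `stream()`, which breaks on a backend that returns no chunks, while with `stream_runnable=False` it falls back to a single `invoke()` call.

```python
# Toy sketch of why stream_runnable=False avoids the assert.
class NonStreamingModel:
    """Stands in for an Azure deployment that cannot stream."""

    def invoke(self, prompt):
        return f"answer to: {prompt}"

    def stream(self, prompt):
        return iter(())  # backend yields no chunks at all

def run_agent(model, prompt, stream_runnable=True):
    if stream_runnable:
        chunks = list(model.stream(prompt))
        # Mirrors LangChain's check that at least one chunk arrived.
        assert chunks, "assert generation is not None"
        return "".join(chunks)
    # Non-streaming path: one blocking call, no chunk bookkeeping.
    return model.invoke(prompt)

model = NonStreamingModel()
print(run_agent(model, "hi", stream_runnable=False))  # works
```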
With that fix I no longer get the
assert generation is not None
AssertionError
but the agent does not output the result yet.
I have also come across this issue when trying to use an Azure LLM instance that cannot stream because it sits behind an application gateway, and I am looking forward to a resolution. @ppk5 I assume from your message above that the fix will rely on an upstream LangChain fix being in place first; thank you for being proactive and raising the issue with them. If anyone has figured out a workaround in the meantime to get CrewAI working in this situation, please let us know :)
Still hitting this issue as of the end of May 2024.
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.
We have an instance of Azure OpenAI, but when using it as the llm I see the generation assert error. However,
llm([message])
works. Is there any workaround to get Azure OpenAI working? Attaching a screenshot of my crew.py.
Here is my AzureOpenAI.py
In the docs I can see that Azure-hosted OpenAI endpoints are supported.
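As a first debugging step, it helps to confirm the Azure endpoint works outside CrewAI entirely. A minimal config sketch, assuming the standard environment variables that LangChain's AzureChatOpenAI reads; the endpoint, deployment name, and api-version here are placeholders you must replace with your own resource's values:

```shell
# Placeholder values -- substitute your own Azure resource details.
export AZURE_OPENAI_API_KEY="..."
export AZURE_OPENAI_ENDPOINT="https://my-resource.openai.azure.com/"
export OPENAI_API_VERSION="2023-05-15"

python - <<'EOF'
from langchain_openai import AzureChatOpenAI
from langchain_core.messages import HumanMessage

# "my-deployment" is a placeholder deployment name.
llm = AzureChatOpenAI(azure_deployment="my-deployment")
# A direct call can succeed even when agent-driven streaming fails:
print(llm.invoke([HumanMessage(content="ping")]).content)
EOF
```

If this direct call works but the crew still fails, the problem is the streaming path discussed above rather than the Azure connection itself.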
Any suggestions on this @joaomdmoura ?