joaomdmoura / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com
MIT License
16.92k stars · 2.28k forks

Azure openai sample code not working probably due to some error in the Azure openai API #563

Open ziki99 opened 2 months ago

ziki99 commented 2 months ago

I tried to run the sample code using Azure OpenAI API from https://github.com/joaomdmoura/crewAI-examples/blob/main/azure_model/main.py

I have made one fix to the original code: adding `expected_output='Detailed report on potential AI trends'`, which resolved the pydantic error.

Now I'm getting an error, probably due to some issue with the Azure OpenAI API:

```
File "/home/ubuntu22/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
```

I have validated that I can connect to the Azure OpenAI API using the same parameters stored in the .env file which I'm importing. I have used the following code for validation:

```python
import os
from openai import AzureOpenAI
from dotenv import load_dotenv

load_dotenv()

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_KEY"),
    api_version="2024-02-01",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
)

deployment_name = os.getenv("AZURE_OPENAI_DEPLOYMENT")

response = client.chat.completions.create(
    model=deployment_name,
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Does Azure OpenAI support customer managed keys?"},
        {"role": "assistant", "content": "Yes, customer managed keys are supported by Azure OpenAI."},
        {"role": "user", "content": "Do other Azure AI services support this too?"}
    ],
)

print(response.choices[0].message.content)
```
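For context on what a 404 here usually means: the Azure OpenAI SDK routes chat completions to `{endpoint}/openai/deployments/{deployment}/chat/completions?api-version={version}`, so "Resource not found" typically indicates that one of those three pieces (endpoint, deployment name, or api-version) does not resolve on the resource. A minimal sketch (the helper name and values are illustrative, not from the SDK) of the URL being targeted:

```python
def azure_chat_url(endpoint: str, deployment: str, api_version: str) -> str:
    # Build the route the Azure OpenAI SDK targets for chat completions.
    # A 404 means this exact path does not exist on your resource
    # (wrong deployment name or unsupported api-version).
    return (
        f"{endpoint.rstrip('/')}/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )

# Example with placeholder values:
print(azure_chat_url("https://my-resource.openai.azure.com/", "gpt-4o", "2024-02-01"))
# → https://my-resource.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-02-01
```

Requesting this URL directly (e.g. with curl and an `api-key` header) can confirm whether the deployment name and api-version pair is valid independently of crewAI.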

robertobalestri commented 2 months ago

I tried changing API version with "2024-02-15-preview" and I was able to resolve the problem. Try it.

```python
import os
from dotenv import load_dotenv
from langchain_openai import AzureChatOpenAI

load_dotenv()

azure_llm = AzureChatOpenAI(
    azure_endpoint=os.environ.get("AZURE_OPENAI_ENDPOINT"),
    api_key=os.environ.get("AZURE_OPENAI_KEY"),
    api_version=os.environ.get("API_VERSION"),
    deployment_name=os.environ.get("DEPLOYMENT_NAME"),
)
```

ziki99 commented 1 month ago

Thank you for your reply. I have used your code with API version "2024-02-15-preview", and I'm still getting the same error:

```
> Entering new CrewAgentExecutor chain...
Traceback (most recent call last):
  File "/home/ubuntu22/projects/mycrewai/AzureGitEXample.py", line 61, in <module>
    tech_crew.kickoff()
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/crewai/crew.py", line 252, in kickoff
    result = self._run_sequential_process()
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/crewai/crew.py", line 293, in _run_sequential_process
    output = task.execute(context=task_output)
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/crewai/task.py", line 173, in execute
    result = self._execute(
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/crewai/task.py", line 182, in _execute
    result = agent.execute_task(
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/crewai/agent.py", line 221, in execute_task
    result = self.agent_executor.invoke(
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain/chains/base.py", line 163, in invoke
    raise e
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain/chains/base.py", line 153, in invoke
    self._call(inputs, run_manager=run_manager)
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/crewai/agents/executor.py", line 124, in _call
    next_step_output = self._take_next_step(
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain/agents/agent.py", line 1138, in _take_next_step
    [
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain/agents/agent.py", line 1138, in <listcomp>
    [
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/crewai/agents/executor.py", line 186, in _iter_next_step
    output = self.agent.plan(
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain/agents/agent.py", line 397, in plan
    for chunk in self.runnable.stream(inputs, config={"callbacks": callbacks}):
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2875, in stream
    yield from self.transform(iter([input]), config, **kwargs)
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2862, in transform
    yield from self._transform_stream_with_config(
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1881, in _transform_stream_with_config
    chunk: Output = context.run(next, iterator)  # type: ignore
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2826, in _transform
    for output in final_pipeline:
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1282, in transform
    for ichunk in input:
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 4736, in transform
    yield from self.bound.transform(
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 1300, in transform
    yield from self.stream(final, config, **kwargs)
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 249, in stream
    raise e
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 229, in stream
    for chunk in self._stream(messages, stop=stop, **kwargs):
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/langchain_openai/chat_models/base.py", line 408, in _stream
    for chunk in self.client.create(messages=message_dicts, **params):
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 579, in create
    return self._post(
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
  File "/home/ubuntu22/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
```
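Since the suggested fix reads every setting from the environment, a silent gap (e.g. `DEPLOYMENT_NAME` set in one .env but not the one actually loaded) would produce exactly this kind of late 404. A small fail-fast check before `kickoff()` can rule that out; this helper and the variable names are illustrative, matching the names used in the snippet above, not part of crewAI:

```python
# Illustrative helper (not part of crewAI): report required Azure OpenAI
# settings that are absent or empty, so misconfiguration fails fast with a
# clear message instead of a 404 deep inside the request stack.
REQUIRED_VARS = ("AZURE_OPENAI_ENDPOINT", "AZURE_OPENAI_KEY", "API_VERSION", "DEPLOYMENT_NAME")

def missing_azure_settings(env: dict) -> list:
    """Return the names of required settings that are missing or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example: only the endpoint is set, so the other three are reported.
problems = missing_azure_settings({"AZURE_OPENAI_ENDPOINT": "https://x.openai.azure.com"})
print(problems)  # → ['AZURE_OPENAI_KEY', 'API_VERSION', 'DEPLOYMENT_NAME']
```

In a real script you would call it as `missing_azure_settings(os.environ)` right after `load_dotenv()` and raise if the returned list is non-empty.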