crewAIInc / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com
MIT License

[BUG] Error with memory=True using AzureOpenAI Provider #1368

Open morhyak opened 1 month ago

morhyak commented 1 month ago

Description

Hi, I encountered an issue when using the AzureOpenAI provider. Setting memory=True results in a parameter error because the embedder cannot be defined with additional parameters beyond model_name and provider. In my configuration for Azure embeddings, I need to include all parameters, including SSO access details. For example, the setup requires something like the following configuration:

"model": EMBEDDING_MODEL,
"deployment": EMBEDDING_MODEL,
"openai_api_key": API_KEY,
"azure_endpoint": API_BASE,
"openai_api_version": API_VERSION,
"openai_api_type": TYPE,
"default_headers": {
    "Authorization": f"Bearer {access_token}",
    "Content-Type": "application/json"
}
Please let me know if there's a workaround or fix for this issue. Thanks

Steps to Reproduce

crew = Crew(
    agents=[support_agent, support_quality_assurance_agent],
    tasks=[inquiry_resolution, quality_assurance_review],
    verbose=True,
    memory=True,
    llm=llm,
    embedder={
        "provider": "azure_openai",
        "config": {
            "model": EMBEDDING_MODEL,
            "deployment": EMBEDDING_MODEL,
            "openai_api_key": API_KEY,
            "azure_endpoint": API_BASE,
            "openai_api_version": API_VERSION,
            "openai_api_type": TYPE,
            "default_headers": {
                "Authorization": f"Bearer {access_token}",
                "Content-Type": "application/json"
            }
        }
    }
)
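Since the failure reported later in this thread is a 401 on the Authorization header, it can help to isolate the header construction from the rest of the config. A minimal, hypothetical helper (build_default_headers is not part of crewAI) mirroring the default_headers entry above:

```python
def build_default_headers(access_token: str) -> dict:
    """Build the same default_headers mapping used in the embedder config,
    so the bearer token can be inspected or tested on its own."""
    return {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }
```

Printing build_default_headers(access_token) before kickoff makes it easy to confirm the token is non-empty and not expired before the embedder ever uses it.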

Expected behavior

I expected the crew to run successfully when started with crew.kickoff().

Screenshots/Code snippets


Operating System

Windows 10

Python Version

3.11

crewAI Version

0.55.2

crewAI Tools Version

0.12.1

Virtual Environment

Venv

Evidence

AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Invalid Authorization token'}

Possible Solution

I attempted to modify the source code by adding additional parameter inputs for the embedder in misc.py and embedder-base.py. However, when running kickoff I still hit the SSO-related error: AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Invalid Authorization token'}

Additional context

-

voytas75 commented 1 month ago

First, start with an updated crewAI if you can:

pip install --upgrade crewai crewai-tools

CrewAI now uses LiteLLM.

embedder for AOAI:

    embedder={
        "provider": "azure_openai",
        "config":{
            "model": "<model>",
            "deployment_name": "<dep name>",
        },        
    },  

And delete the "OPENAI_API_BASE" environment variable if it exists: https://github.com/langchain-ai/langchain/discussions/17790#discussioncomment-8690960
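A minimal sketch of that cleanup (drop_stale_openai_base is a hypothetical helper; pop with a default avoids a KeyError when the variable is not set):

```python
import os

def drop_stale_openai_base(env=os.environ):
    """Remove OPENAI_API_BASE so it cannot shadow the Azure endpoint;
    returns the removed value, or None if it was not set."""
    return env.pop("OPENAI_API_BASE", None)

drop_stale_openai_base()
```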

voytas75 commented 1 month ago

... and prepare the environment, for example:

os.environ.update({
    "AZURE_API_KEY": os.getenv("AZURE_OPENAI_API_KEY"),
    "AZURE_API_BASE": os.getenv("AZURE_OPENAI_ENDPOINT"),
    "AZURE_API_VERSION": os.getenv("AZURE_OPENAI_API_VERSION"),
})
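One caveat with the snippet above: os.getenv returns None for unset variables, and assigning None into os.environ raises a TypeError. A defensive variant (same variable names as above; prepare_env is a hypothetical helper) might look like:

```python
import os

# LiteLLM-style names mapped to the AZURE_OPENAI_* variables used above.
MAPPING = {
    "AZURE_API_KEY": "AZURE_OPENAI_API_KEY",
    "AZURE_API_BASE": "AZURE_OPENAI_ENDPOINT",
    "AZURE_API_VERSION": "AZURE_OPENAI_API_VERSION",
}

def prepare_env(source=os.environ):
    """Copy only the variables that are actually set, so os.environ
    never receives a None value."""
    return {dst: source[src] for dst, src in MAPPING.items() if src in source}

os.environ.update(prepare_env())
```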
morhyak commented 1 month ago

Hi, thanks for your response. However, the SSO issue wasn't addressed. I need to define the Authorization tokens as follows:

    embedder={
        "provider": "azure_openai",
        "config": {
            "model": EMBEDDING_MODEL,
            "deployment": EMBEDDING_MODEL,
            "openai_api_key": API_KEY,
            "azure_endpoint": API_BASE,
            "openai_api_version": API_VERSION,
            "openai_api_type": TYPE,
            "default_headers": {
                "Authorization": f"Bearer {access_token}",
                "Content-Type": "application/json"
            }
        }
    }
voytas75 commented 1 month ago

Double-check whether you have azure_deployment in your embedder config; the error points to a wrong argument name, azure_deployment.

morhyak commented 1 month ago

After configuring all the parameters, I still encountered the SSO-related error:

AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Invalid Authorization token'}

voytas75 commented 1 month ago

Is API_BASE OK (https://YOUR_RESOURCE_NAME.openai.azure.com/)? Invalid Authorization token probably comes from OpenAI; Azure OpenAI uses different wording: Access denied due to invalid subscription key. Make sure to provide a valid key for an active subscription.
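To sanity-check the endpoint shape being discussed here (the resource-based https://YOUR_RESOURCE_NAME.openai.azure.com/ form), a small hypothetical validator:

```python
import re

# Matches the resource-based Azure OpenAI endpoint form discussed above.
AZURE_OPENAI_BASE = re.compile(
    r"^https://[a-z0-9-]+\.openai\.azure\.com/?$", re.IGNORECASE
)

def looks_like_azure_openai_base(url: str) -> bool:
    """True if url matches https://<resource>.openai.azure.com/."""
    return bool(AZURE_OPENAI_BASE.match(url))
```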

sorin-costea commented 1 month ago

Shouldn't the Azure OpenAI base be like https://YOUR_ZONE.api.cognitive.microsoft.com/ ?

Edit: the subscription-key error can also appear when your computer is logged in to AD with a different account/subscription than the Azure resource; check the documentation about "az login --use-device-code".

voytas75 commented 1 month ago

No, see https://learn.microsoft.com/en-us/azure/ai-services/openai/reference; the Azure Cognitive Services APIs are a different service. I do not think crewai by itself or LiteLLM supports Cognitive.

morhyak commented 1 month ago

All the parameter configurations are correct (I double-checked them in another process). However, after adding extra options to the config parameters (for the SSO tokens), I am now encountering the following error:

ConnectError: [Errno 11001] getaddrinfo failed
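getaddrinfo failed is a DNS resolution failure, not an authentication problem, so the hostname in the configured endpoint is the first thing to check. A small sketch (hypothetical helpers; the lookup itself needs network access):

```python
import socket
from urllib.parse import urlparse

def endpoint_host(url: str) -> str:
    """Extract the hostname that getaddrinfo would try to resolve."""
    host = urlparse(url).hostname
    if not host:
        raise ValueError(f"no hostname in endpoint: {url!r}")
    return host

def can_resolve(url: str) -> bool:
    """Attempt the same DNS lookup the HTTP client performs."""
    try:
        socket.getaddrinfo(endpoint_host(url), 443)
        return True
    except socket.gaierror:
        return False
```

A typo in azure_endpoint, a missing https:// scheme, or an unconfigured corporate proxy can all surface as this same Errno 11001 on Windows.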

httplups commented 1 month ago

Hi, I have the same error in a different environment. I am using Gemini on VertexAI via litellm. My crew runs with memory=False. When I set memory to True, I get this error: AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: fake. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

I am doing:

crew = Crew(
    agents=[support_agent, support_quality_assurance_agent],
    tasks=[inquiry_resolution, quality_assurance_review],
    verbose=True,
    memory=True
)

sorin-costea commented 1 month ago

No, https://learn.microsoft.com/en-us/azure/ai-services/openai/reference and Azure Cognitive Services APIs is other service. I do not think crewai by itself or liteLLM support Cognitive.

LiteLLM says it already supports the cognitive endpoints (see https://github.com/BerriAI/litellm/discussions/5995), and (as far as I can tell) the only way to use Azure OpenAI is with this Cognitive regional service endpoint. The resource-based endpoints are called legacy, and fewer and fewer regions support them. I wasn't able to find a region supporting them, but I also didn't try every region...

PS: could it be that all it needs is a dependency version bump?

PPS: I get it running fine like this, so I guess AzureChatOpenAI has an issue:

azure_llm = LLM(
    model="azure/deployment_name",
    base_url=regional_complete_deployment_url,
    api_key=api_key
)
voytas75 commented 1 month ago

OK, maybe they are cooking :) In their docs (https://docs.litellm.ai/docs/providers/azure) there is no info about ...cognitive.microsoft.com..., so does your code using cognitive.microsoft.com work on LiteLLM?

sorin-costea commented 1 month ago

@voytas75 Yes, it works fine, as in the example above. Only langchain's component doesn't, but since its version is pinned by crewai I didn't try their latest; maybe it works there by now too.

PS: and you're right, nobody has gotten far enough to document those new URLs yet.

github-actions[bot] commented 2 days ago

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.