Codium-ai / pr-agent

🚀CodiumAI PR-Agent: An AI-Powered 🤖 Tool for Automated Pull Request Analysis, Feedback, Suggestions and More! 💻🔍
Apache License 2.0

Azure OpenAI implementation is mostly returning a weird 401 (Unauthorized. Access token is missing...), BUT sometimes works fine #1169

Closed DeviousM closed 3 weeks ago

DeviousM commented 3 weeks ago

Hi, I am using the Bitbucket Cloud integration in a Bitbucket pipeline. Everything is configured "by the book" and it feels like it should be working fine, but for some reason we're getting an openai.AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'} error.

We're getting this error log (almost exactly the same every single time) in most of our reviews; however, some reviews pass fine, which is even stranger, since that suggests it isn't a configuration error:

2024-08-23 07:48:41.699 | INFO     | pr_agent.git_providers.utils:apply_repo_settings:41 - Applying repo settings:
{'OPENAI': {'key': '$OPENAI_API_KEY', 'api_type': 'azure', 'api_version': '2024-04-01-preview', 'api_base': 'https://$OPEN_AI_ENDPOINT_NAME.openai.azure.com', 'deployment_id': 'gpt-4o'}, 'CONFIG': {'model': 'gpt-4o', 'model_turbo': 'gpt-4-turbo'}}
2024-08-23 07:48:43.755 | INFO     | pr_agent.tools.pr_reviewer:run:111 - Reviewing PR: https://bitbucket.org/repo-name/org-name/pull-requests/1234 ...
2024-08-23 07:48:44.178 | INFO     | pr_agent.algo.pr_processing:get_pr_diff:63 - PR main language: Other
2024-08-23 07:48:44.184 | INFO     | pr_agent.algo.pr_processing:get_pr_diff:74 - Tokens: 1162, total tokens under limit: 32000, returning full diff.
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-08-23 07:48:44.763 | WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference: 
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-08-23 07:48:45.251 | WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference: 
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-08-23 07:48:45.774 | WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference: 
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-08-23 07:48:46.369 | WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference: 
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2024-08-23 07:48:46.944 | WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference: 
2024-08-23 07:48:46.951 | WARNING  | pr_agent.algo.pr_processing:retry_with_fallback_models:331 - Failed to generate prediction with gpt-4o from deployment gpt-4o: Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/litellm/main.py", line 401, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/azure.py", line 779, in acompletion
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/azure.py", line 731, in acompletion
    headers, response = await self.make_azure_openai_chat_completion_request(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/azure.py", line 485, in make_azure_openai_chat_completion_request
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/azure.py", line 477, in make_azure_openai_chat_completion_request
    raw_response = await azure_client.chat.completions.with_raw_response.create(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_legacy_response.py", line 367, in wrapped
    return cast(LegacyAPIResponse[R], await func(*args, **kwargs))
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 1339, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1815, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1509, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1610, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/tenacity/_asyncio.py", line 50, in __call__
    result = await fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/pr_agent/algo/ai_handlers/litellm_ai_handler.py", line 216, in chat_completion
    response = await acompletion(**kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/utils.py", line 1579, in wrapper_async
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/utils.py", line 1399, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/main.py", line 424, in acompletion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/utils.py", line 8301, in exception_type
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/utils.py", line 8087, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AzureException AuthenticationError - Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/app/pr_agent/algo/pr_processing.py", line 329, in retry_with_fallback_models
    return await f(model)
           ^^^^^^^^^^^^^^
  File "/app/pr_agent/tools/pr_reviewer.py", line 163, in _prepare_prediction
    self.prediction = await self._get_prediction(model)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/pr_agent/tools/pr_reviewer.py", line 185, in _get_prediction
    response, finish_reason = await self.ai_handler.chat_completion(
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tenacity/_asyncio.py", line 47, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tenacity/__init__.py", line 326, in iter
    raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x7ff06c5de810 state=finished raised AuthenticationError>]
2024-08-23 07:48:46.970 | INFO     | pr_agent.algo.pr_processing:get_pr_diff:63 - PR main language: Other
2024-08-23 07:48:46.975 | INFO     | pr_agent.algo.pr_processing:get_pr_diff:74 - Tokens: 1162, total tokens under limit: 32000, returning full diff.
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-08-23 07:48:47.486 | WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference: 
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-08-23 07:48:47.970 | WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference: 
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-08-23 07:48:48.464 | WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference: 
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-08-23 07:48:48.964 | WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference: 
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-08-23 07:48:49.446 | WARNING  | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference: 
2024-08-23 07:48:49.452 | WARNING  | pr_agent.algo.pr_processing:retry_with_fallback_models:331 - Failed to generate prediction with gpt-4o-2024-05-13 from deployment gpt-4o: Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/litellm/main.py", line 401, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/azure.py", line 779, in acompletion
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/azure.py", line 731, in acompletion
    headers, response = await self.make_azure_openai_chat_completion_request(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/azure.py", line 485, in make_azure_openai_chat_completion_request
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/llms/azure.py", line 477, in make_azure_openai_chat_completion_request
    raw_response = await azure_client.chat.completions.with_raw_response.create(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_legacy_response.py", line 367, in wrapped
    return cast(LegacyAPIResponse[R], await func(*args, **kwargs))
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 1339, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1815, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1509, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1610, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/tenacity/_asyncio.py", line 50, in __call__
    result = await fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/pr_agent/algo/ai_handlers/litellm_ai_handler.py", line 216, in chat_completion
    response = await acompletion(**kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/utils.py", line 1579, in wrapper_async
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/utils.py", line 1399, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/main.py", line 424, in acompletion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/litellm/utils.py", line 8301, in exception_type
    raise e
  File "/usr/local/lib/python3.12/site-packages/litellm/utils.py", line 8087, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AzureException AuthenticationError - Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/app/pr_agent/algo/pr_processing.py", line 329, in retry_with_fallback_models
    return await f(model)
           ^^^^^^^^^^^^^^
  File "/app/pr_agent/tools/pr_reviewer.py", line 163, in _prepare_prediction
    self.prediction = await self._get_prediction(model)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/pr_agent/tools/pr_reviewer.py", line 185, in _get_prediction
    response, finish_reason = await self.ai_handler.chat_completion(
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tenacity/_asyncio.py", line 88, in async_wrapped
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tenacity/_asyncio.py", line 47, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tenacity/__init__.py", line 326, in iter
    raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x7ff06c5f9340 state=finished raised AuthenticationError>]
2024-08-23 07:48:49.452 | ERROR    | pr_agent.tools.pr_reviewer:run:152 - Failed to review PR: RetryError[<Future at 0x7ff06c5f9340 state=finished raised AuthenticationError>]

There are no logged errors in the Azure OpenAI admin panel, nor is there anything else that looks suspicious.

Do you have any idea what could be causing this? I am slowly losing my mind over this issue, as I can't pinpoint any specific cause and Googling the error hasn't been helpful.
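For anyone debugging a similar intermittent 401: since the config above injects the key and endpoint via pipeline variables ($OPENAI_API_KEY, $OPEN_AI_ENDPOINT_NAME), one quick sanity check is to verify inside the pipeline that those variables are actually populated and expanded before pr-agent runs. This is a hedged sketch, not part of pr-agent; the variable names follow the config shown above:

```python
import os

def check_env(names):
    """Return the names whose values look unset or like unexpanded placeholders."""
    bad = []
    for name in names:
        value = os.environ.get(name, "")
        # An empty value means the pipeline variable was never injected;
        # a value starting with "$" means substitution did not happen.
        if not value or value.startswith("$"):
            bad.append(name)
    return bad

# Variables referenced in the .pr_agent.toml above.
problems = check_env(["OPENAI_API_KEY", "OPEN_AI_ENDPOINT_NAME"])
if problems:
    print("Suspect variables:", problems)
```

Running this as an early pipeline step makes a "credentials never reached the tool" failure obvious without printing the secret itself.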

DeviousM commented 3 weeks ago

Ok, never mind, sorry about the trouble. The issue was that Bitbucket Pipelines does not substitute repository variables inside .pr_agent.toml, so pr-agent was making requests with the literal, unexpanded key.

I'm going to leave this issue here for anyone struggling with the same problem.