MaximeLanglet opened this issue 4 months ago
The error message you're encountering indicates a problem authenticating with Azure's OpenAI service: the access token may be missing, invalid, carry an incorrect audience, or have expired. This suggests a problem with how the authentication parameters are being used.
From the code snippet you've shared, it looks like you're initializing the AzureOpenAI class correctly with the necessary parameters. However, given the error, there may be a discrepancy in how these parameters are being used for authentication.
Here are a few potential solutions and insights based on similar issues:
Environment Variables Before Module Import: Ensure you set the environment variable for the API key before importing the AzureOpenAI module. This approach has resolved similar issues for other users.
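As a minimal sketch of that ordering (the variable names below are the ones the underlying openai client reads by default; the values are placeholders you must replace):

```python
import os

# Set the credentials BEFORE importing AzureOpenAI, so the client
# library sees them when it initializes. Placeholder values only.
os.environ["AZURE_OPENAI_API_KEY"] = "<api-key>"
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://<your-resource-name>.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "2023-07-01-preview"

# Only now import and construct the client (uncomment in a real script;
# the import path assumes llama-index 0.10-style packaging):
# from llama_index.llms.azure_openai import AzureOpenAI
# llm = AzureOpenAI(engine="<your-deployment>", model="gpt-35-turbo-16k")
```

If the variables are set after the import, some code paths can end up with a stale or empty key, which would produce exactly this kind of 401.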
Using ServiceContext: In some cases, using the ServiceContext for initializing the AzureOpenAI class has been suggested. Although this was specifically mentioned for a different version, it might be worth exploring:
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import AzureOpenAI  # import needed for the snippet below

service_context = ServiceContext.from_defaults(
    llm=AzureOpenAI(engine="gpt-4-32k", model="gpt-4-32k"),
)
set_global_service_context(service_context)
Custom Embeddings Setup: If customization is part of your implementation, ensure both the LLM and embedding model are correctly set in the service context. This is crucial for avoiding reliance on the default OpenAI model.
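A sketch of that combined setup, assuming legacy llama-index (0.9-style) imports to match the ServiceContext snippet above; the deployment and model names are placeholders for your own Azure deployments:

```python
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import AzureOpenAI
from llama_index.embeddings import AzureOpenAIEmbedding

# Both the LLM and the embedding model must point at YOUR Azure
# deployments; if the embed model is left unset, indexing falls back
# to the default OpenAI endpoint, which the Azure key cannot authenticate.
service_context = ServiceContext.from_defaults(
    llm=AzureOpenAI(engine="<llm-deployment>", model="gpt-4-32k"),
    embed_model=AzureOpenAIEmbedding(
        deployment_name="<embedding-deployment>",
        model="text-embedding-ada-002",
    ),
)
set_global_service_context(service_context)
```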
API Key Validation Issue: There was a known bug related to API key validation that has been addressed. Although your version might not be directly affected, checking the specifics of this fix could provide some clues.
If these suggestions do not resolve the issue, more details about how you're setting the environment variables and any additional context about your application's deployment might help in further diagnosing the problem.
I'm not an Azure expert, but it seems like some config is missing. This works fine for me on my Azure setup:
api_key = "<api-key>"
azure_endpoint = "https://<your-resource-name>.openai.azure.com/"
api_version = "2023-07-01-preview"
llm = AzureOpenAI(
    model="gpt-35-turbo-16k",
    deployment_name="my-custom-llm",
    api_key=api_key,
    azure_endpoint=azure_endpoint,
    api_version=api_version,
)
@logan-markewich thanks for the response. Which config are you referring to precisely? I can't seem to find anything wrong.
I meant that some part of your LLM setup seems incorrect, otherwise you wouldn't get that error.
The code I posted above works for my Azure deployment 🤷. Maybe also double-check that the values are being set as expected (I see you are using os envs; maybe try hardcoding them to make sure?).
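One quick, stdlib-only way to do that check is to print what the process actually sees (the variable names below are the defaults the openai client reads; adjust if you pass credentials explicitly):

```python
import os

def check_azure_env(names=("AZURE_OPENAI_API_KEY",
                           "AZURE_OPENAI_ENDPOINT",
                           "OPENAI_API_VERSION")):
    """Return a dict of the Azure-related env vars visible to this process.

    Unset variables come back as None, which is a common cause of a 401:
    the client silently sends an empty or wrong token.
    """
    return {name: os.environ.get(name) for name in names}

for name, value in check_azure_env().items():
    print(name, "->", "set" if value else "MISSING")
```

Run this at the top of the failing script: if any variable prints MISSING, the environment the script runs in (shell, notebook kernel, container) is not the one where you exported the keys.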
Hi @MaximeLanglet, did you find any solution regarding this issue?
Bug Description
While trying to execute a basic script using the Azure OpenAI models, I encounter this error:
openai.AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'}
I verified, of course, that the API keys used are correct. The code that raises the error is the following:
I can't seem to figure out what is wrong; I tried different llama-index versions without any luck. Does anyone know what is causing the error?
Version
0.10.19
Steps to Reproduce
Execute the following code in a Python file or in a notebook.
Relevant Logs/Tracebacks