Closed · luisdreisbach closed this issue 1 week ago
Hey @luisdreisbach, I could not reproduce the issue when running the authentication tests included in the repo (with both an older openai package version and the latest 1.35.4). Can you please check again and provide code to reproduce if the problem persists?
Hey, I think this is not related to the authentication test, because I do not want to use Azure AD for authentication at all. To reproduce, I use this simple config.local.yaml (no need to put in real secrets; it fails before they are used):
```yaml
clients:
  - name: Client
    description: Main client
    key: 04ae14bc78184621d37f1ce57a52eb7
    deployments_allowed: [gpt-4o]
    max_tokens_per_minute_in_k: 500

plugins:
  - name: AllowDeployments
  - name: LimitUsage
  - name: LogUsageToConsole
  - name: LogUsageToCsvFile

# Azure OpenAI
aoai:
  endpoints:
    - name: gpt-4o-deployment
      url: https://___.openai.azure.com
      key: ___
      non_streaming_fraction: 1
      connections:
        limits:
          max_connections: 100
          max_keepalive_connections: 20
          keepalive_expiry: 5
        timeouts:
          connect: 15
          read: 120
          write: 120
          pool: 120
  virtual_deployments:
    - name: gpt-4o
      standins:
        - name: ___
          non_streaming_fraction: 1
```
Start the proxy with:

```shell
python powerproxy.py --config-file ../config/config.local.yaml
```

and run:

```shell
python test/python/test_non_streaming.py --deployment-name gpt-4o
```
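For context, test_non_streaming.py presumably issues a plain, key-authenticated chat-completions call through the proxy. Here is a stdlib-only sketch of the equivalent request; the proxy URL and api-version are placeholder assumptions, the key is the one from config.local.yaml, and nothing is actually sent (we only build the request object):

```python
import json
from urllib.request import Request

# Hypothetical proxy address and API version -- adjust to your setup.
PROXY_URL = ("http://localhost/openai/deployments/gpt-4o/"
             "chat/completions?api-version=2024-02-01")
CLIENT_KEY = "04ae14bc78184621d37f1ce57a52eb7"  # key from config.local.yaml

# Minimal chat-completions payload, as the test script is assumed to send.
body = json.dumps({
    "messages": [{"role": "user", "content": "Say hello."}],
}).encode()

# Plain API-key auth: the key goes in the 'api-key' header.
request = Request(PROXY_URL, data=body, method="POST", headers={
    "api-key": CLIENT_KEY,
    "Content-Type": "application/json",
})

# To actually send it, start the proxy first, then:
# from urllib.request import urlopen
# print(json.load(urlopen(request)))
```

The point of the bug report is that this request should be mapped to the configured client by its `api-key` header alone, without any Entra ID involvement.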
which leads to:

```
Traceback (most recent call last):
  File "/Users/powerproxy-aoai/test/python/test_non_streaming.py", line 28, in <module>
    response = client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/powerproxy-aoai/.venv/lib/python3.11/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/powerproxy-aoai/.venv/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 590, in create
    return self._post(
           ^^^^^^^^^^^
  File "/Users/powerproxy-aoai/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/powerproxy-aoai/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/Users/powerproxy-aoai/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': "When Entra ID/Azure AD is used to authenticate, PowerProxy needs a client in itsconfiguration configured with 'uses_entra_id_auth: true', so PowerProxy can map the request to a client."}
```
It's the same error with openai versions 1.30.3 and 1.35.5.
Thanks, I will take a look into it.
@luisdreisbach I have just released a new version which fixes the problem.
I wrongly receive the warning that Entra ID/Azure AD is not set up.

Steps to reproduce

Send a key-authenticated request through the proxy with OpenAI SDK version 1.35.1 and powerproxy version 0.11.

Expected behavior

Response from the underlying OpenAI model deployment.

Actual behavior

Entra ID/Azure AD is identified as the authentication method.

Background

It seems that the OpenAI SDK additionally sends the API key via the Authorization header (link to code). This triggers the Entra ID/Azure AD check.
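To illustrate the misclassification described above (a hypothetical reconstruction for discussion, not PowerProxy's actual code): if the proxy treats any `Authorization` header as a sign of Entra ID auth, a key-authenticated SDK request that carries the key in both `api-key` and `Authorization: Bearer ...` is misclassified. Checking `api-key` first avoids that:

```python
def identify_auth_method(headers: dict[str, str]) -> str:
    """Hypothetical sketch of how the auth method could be classified."""
    lowered = {k.lower(): v for k, v in headers.items()}

    # Buggy order (commented out): any Authorization header is taken as
    # Entra ID/Azure AD, even though the SDK also sends
    # 'Authorization: Bearer <key>' alongside 'api-key' under plain key auth.
    # if "authorization" in lowered:
    #     return "entra_id"

    # Fixed order: an explicit api-key header wins.
    if "api-key" in lowered:
        return "api_key"
    if "authorization" in lowered:
        return "entra_id"
    return "anonymous"

# A key-authenticated SDK request carries both headers:
sdk_headers = {"api-key": "abc", "Authorization": "Bearer abc"}
```

With the fixed ordering, `identify_auth_method(sdk_headers)` maps the request to plain key auth, so the `uses_entra_id_auth` requirement never fires for key-based clients.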