Closed OskarLiew closed 10 hours ago
I was able to solve the issue by removing the prefix stripper, but I was caught off guard as to why authorization had stopped working.
Indeed, I can also reproduce this issue locally. @DarkLight1337 could you please confirm this issue as well?
I will try to fix this problem.
Sorry, I don't have time to debug this. I can help review your PR though.
@OskarLiew Hi, can you please help me test #10606 again? I was able to solve this issue when testing it locally.
Your current environment
The output of `python collect_env.py`
```text
Your output of `python collect_env.py` here
```

Model Input Dumps
No response
🐛 Describe the bug
I was running vLLM behind a route-based proxy (Traefik) and noticed that I could use the API without any token. The problem seems to be that the API is still available on the default path `/v1/...` and not just on `/root_path/v1/...`, while the key is only verified for `/root_path/v1/...`. I was stripping the prefix, so I was hitting the `/v1/...` endpoint, but I needed to set the root path to be able to fetch the OpenAPI schema for Swagger.

I was running the `vllm/vllm-openai:v0.6.4` image.

Here is a minimal example that reproduces the bug:
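The launch command from the original report is not preserved in this extraction; a sketch of a setup matching the description (the model name and API key below are placeholders I picked, not values from the report) would be:

```bash
docker run --gpus all -p 8000:8000 \
    vllm/vllm-openai:v0.6.4 \
    --model Qwen/Qwen2.5-0.5B-Instruct \
    --api-key supersecret \
    --root-path /llm
```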
Then the following request still works without an authentication error:
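The exact request is not preserved either; an unauthenticated call along these lines (no `Authorization` header at all, model name being the placeholder from above) is what the report describes:

```bash
curl localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "Qwen/Qwen2.5-0.5B-Instruct",
        "messages": [{"role": "user", "content": "Hello"}]
    }'
```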
Sending a request to `localhost:8000/llm/v1/chat/completions` works as expected, i.e. the key is checked on that path.

I've checked the source code, and the issue seems to stem from the authentication middleware in `vllm/entrypoints/openai/api_server.py:481`.
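For reference, that middleware only applies the key check to paths under the configured root path, so the un-prefixed `/v1/...` routes fall through unauthenticated. The snippet below is a self-contained FastAPI sketch of that failure mode, not the actual vLLM source; `root_path`, `token`, and the stub endpoint are stand-ins:

```python
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

root_path = "/llm"     # stand-in for the --root-path value
token = "supersecret"  # stand-in for the --api-key value

app = FastAPI()


@app.middleware("http")
async def authentication(request: Request, call_next):
    # Only paths that start with "<root_path>/v1" are guarded, so a request
    # to the bare "/v1/..." path never reaches the key comparison below.
    if not request.url.path.startswith(f"{root_path}/v1"):
        return await call_next(request)
    if request.headers.get("Authorization") != f"Bearer {token}":
        return JSONResponse(status_code=401, content={"error": "Unauthorized"})
    return await call_next(request)


@app.post("/v1/chat/completions")
async def chat_completions():
    # Stand-in for the real endpoint: reachable without any key because the
    # middleware above never matches this un-prefixed path.
    return {"ok": True}
```

POSTing to `/v1/chat/completions` on this toy app succeeds with no key at all, which mirrors the behaviour reported above.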
Before submitting a new issue...