vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Feature]: Multiple Secret Keys #9698

Open CHesketh76 opened 4 weeks ago

CHesketh76 commented 4 weeks ago

🚀 The feature, motivation and pitch

I noticed that when starting the vLLM API server, I am limited to a single secret key. I have multiple users, and I would like the ability to drop service for users who are misusing my API. Would this be possible?

Alternatives

No response

Additional context

No response


DarkLight1337 commented 3 weeks ago

This is outside the scope of vLLM. I recommend you implement your own security layer on top of the vLLM service.