triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

Hardening guide for Triton Server #4529

Open jax79sg opened 2 years ago

jax79sg commented 2 years ago

Is your feature request related to a problem? Please describe. My organisation has strict security requirements, and one of the baselines is a hardening guide to lock the server down to the bare minimum.

Describe the solution you'd like A guide or script offering hardening options to users. One example would be locking down certain directories.

Describe alternatives you've considered Installing Triton on hardened images or OSes.

Additional context None
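To illustrate the kind of lockdown the request describes, here is a minimal sketch of running the Triton container with standard Docker hardening flags: a read-only root filesystem, all Linux capabilities dropped, no privilege escalation, and a non-root user. The flags are standard `docker run` options; the host model-repository path (`/opt/models`) and the UID are assumptions for the example, not a recommendation from the Triton team.

```shell
# Hypothetical hardening sketch (not an official Triton guide):
# - --read-only locks the container filesystem; writable paths must be
#   mounted explicitly (here only /tmp via tmpfs).
# - --cap-drop=ALL removes all Linux capabilities.
# - --security-opt no-new-privileges blocks setuid-style escalation.
# - The model repository is mounted read-only (:ro).
docker run --rm \
  --read-only \
  --tmpfs /tmp \
  --cap-drop=ALL \
  --security-opt no-new-privileges \
  --user 1000:1000 \
  -v /opt/models:/models:ro \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  nvcr.io/nvidia/tritonserver:22.05-py3 \
  tritonserver --model-repository=/models
```

Whether a given backend works under these restrictions depends on what it writes at runtime, so each flag would need to be validated against the actual deployment.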

dyastremsky commented 2 years ago

I've filed this ticket as an enhancement. An overview of security options could be useful.

For now, you may want to look at the security concerns you specifically have. We generally build Triton with security in mind. When you go to the NGC catalog and preview the Triton Docker image (e.g. 22.05), you can find the results of the security scan that we run before every release. We have plenty of users with strict security requirements, some of whom choose to be featured on the NVIDIA blog, Google News (for those releasing a press release about their use of Triton), or the NVIDIA product page here. If you have specific questions, concerns, or requirements, let us know.

For now, I'll file an enhancement to see if we can write up a security or hardening guide.