triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

Can triton server support trace_id generator config? #7817

Open stknight43 opened 5 days ago

stknight43 commented 5 days ago

We want to use tracing in triton-server, but we discovered that the trace ID in Triton is just an auto-incrementing number, not a trace ID as defined by the W3C Trace Context specification (https://www.w3.org/TR/trace-context/), where a trace ID is a 16-byte array, for example `4bf92f3577b34da6a3ce929d0e0e4736`. Because of this, our OTLP collector drops the trace IDs coming from Triton. Can Triton support a trace_id generator config?
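For reference, here is a minimal sketch of what a W3C-compliant trace ID generator looks like (this is an illustration of the spec's format, not Triton's current implementation): 16 random bytes rendered as 32 lowercase hex characters, where an all-zero value is reserved as invalid.

```python
import re
import secrets

def generate_trace_id() -> str:
    """Generate a W3C Trace Context trace-id: 32 lowercase hex chars
    (16 random bytes). The all-zero value is invalid per the spec,
    so retry in that (astronomically unlikely) case."""
    while True:
        trace_id = secrets.token_hex(16)  # 16 bytes -> 32 hex characters
        if trace_id != "0" * 32:
            return trace_id

tid = generate_trace_id()
# Must match the spec's trace-id grammar: exactly 32 lowercase hex digits.
assert re.fullmatch(r"[0-9a-f]{32}", tid)
print(tid)
```

An auto-incrementing counter (e.g. `1`, `2`, `3`, ...) fails this format check, which is consistent with a strict OTLP collector rejecting or dropping such IDs.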