triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

Cherry-pick: Fix: Add mutex lock for state completion check in gRPC streaming to prevent race condition #7618

Closed — pskiran1 closed this 2 months ago

pskiran1 commented 2 months ago

What does the PR do?

Cherry-pick #7617

pskiran1 commented 2 months ago

Created for the wrong target branch.