Feature Request

Describe the problem the feature is intended to solve

Recently I found that building TensorFlow Serving with tcmalloc and setting a soft memory limit can mitigate these kinds of memory issues: https://github.com/tensorflow/serving/issues/2142, https://github.com/tensorflow/serving/issues/1664. I also got slightly better performance.

Here's what I did:

1. Build the tensorflow_model_server cc_binary with tcmalloc using the malloc argument.
2. Launch a background thread before the server starts using tcmalloc::MallocExtension::ProcessBackgroundActions (reference).
3. Set a soft memory limit, and add a tcmalloc_soft_limit argument to configure it.

Describe the solution

How about providing TensorFlow Serving compiled with tcmalloc? I know I can use jemalloc instead, but jemalloc has a lot of configuration options, while tcmalloc doesn't. It is easy to use and has great performance.
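The build step above can be sketched as a Bazel BUILD fragment. This is a sketch, not TensorFlow Serving's actual BUILD file; the `@com_google_tcmalloc//tcmalloc` repository label and the `deps` shown are assumptions about how tcmalloc would be wired into the workspace:

```python
# Sketch: swap the allocator via cc_binary's malloc attribute.
# "@com_google_tcmalloc//tcmalloc" is an assumed repository label; the real
# name depends on how tcmalloc is imported into the Serving workspace.
cc_binary(
    name = "tensorflow_model_server",
    srcs = ["main.cc"],
    # Bazel links this target against the given allocator library
    # instead of the default system malloc.
    malloc = "@com_google_tcmalloc//tcmalloc",
    deps = [":server_lib"],  # placeholder dependency
)
```

Bazel's `cc_binary` rule supports this `malloc` attribute natively, so no source changes are needed for the allocator swap itself.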
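The background-thread and soft-limit steps could look roughly like this in the server's `main()`. This is a sketch under assumptions, not TensorFlow Serving's actual code: it assumes the Google tcmalloc headers are linked in, and `kSoftLimitBytes` stands in for the value of the proposed `tcmalloc_soft_limit` argument, which would be parsed elsewhere:

```cpp
#include <cstddef>
#include <thread>

#include "tcmalloc/malloc_extension.h"  // requires linking against tcmalloc

// Placeholder for the value of a hypothetical --tcmalloc_soft_limit flag.
constexpr size_t kSoftLimitBytes = 8ull << 30;  // e.g. 8 GiB

void StartTcmallocBackgroundThread() {
  // ProcessBackgroundActions never returns, so it must run in its own
  // thread; among other housekeeping, it periodically releases free
  // memory back to the OS.
  std::thread bg([] { tcmalloc::MallocExtension::ProcessBackgroundActions(); });
  bg.detach();
}

int main(int argc, char** argv) {
  StartTcmallocBackgroundThread();

  // With a soft limit, tcmalloc tries to keep usage under the bound by
  // releasing memory more aggressively, rather than failing allocations
  // (which is what a hard limit would do).
  tcmalloc::MallocExtension::SetMemoryLimit(
      kSoftLimitBytes, tcmalloc::MallocExtension::LimitKind::kSoft);

  // ... parse flags and start the model server here ...
  return 0;
}
```

With this in place, the allocator's background work starts before the server begins taking traffic, matching the order described in the steps above.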