aws / sagemaker-tensorflow-serving-container

A TensorFlow Serving solution for use in SageMaker. This repo is now deprecated.
Apache License 2.0

feature: expose tunable parameters to support multiple tfs #188

Closed jinpengqi closed 3 years ago

jinpengqi commented 3 years ago

Implement a new container environment that exposes additional parameters, helping users optimize throughput and fine-tune container performance.

Co-authored-by: Liang Ma liangmaa@amazon.com

Issue #, if available:

Description of changes:

To elaborate, users can tune environment parameters such as SAGEMAKER_GUNICORN_WORKERS, SAGEMAKER_TFS_INSTANCE_COUNT, SAGEMAKER_TFS_INTER_OP_PARALLELISM, and SAGEMAKER_TFS_INTRA_OP_PARALLELISM. With these set, the container starts multiple gunicorn workers that serve requests to different TensorFlow model server instances in parallel, so throughput improves according to the parameters chosen. A hedged deployment sketch follows below.
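
The following is a minimal sketch of how these environment variables might be passed to the container at deployment time via the SageMaker Python SDK's `env` argument. The S3 path, IAM role, framework version, instance type, and parameter values are illustrative placeholders, not taken from this PR.

```python
from sagemaker.tensorflow import TensorFlowModel

model = TensorFlowModel(
    model_data="s3://my-bucket/model.tar.gz",              # placeholder model artifact
    role="arn:aws:iam::123456789012:role/SageMakerRole",   # placeholder IAM role
    framework_version="2.3",
    env={
        # Number of gunicorn workers handling incoming requests.
        "SAGEMAKER_GUNICORN_WORKERS": "4",
        # Number of TensorFlow Serving processes started in the container.
        "SAGEMAKER_TFS_INSTANCE_COUNT": "2",
        # Thread-pool sizes for TensorFlow inter-op and intra-op parallelism.
        "SAGEMAKER_TFS_INTER_OP_PARALLELISM": "2",
        "SAGEMAKER_TFS_INTRA_OP_PARALLELISM": "4",
    },
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.c5.xlarge",  # placeholder instance type
)
```

Reasonable values depend on the instance's vCPU count and the model's compute profile, so these knobs are typically tuned empirically against a throughput benchmark.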

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

sagemaker-bot commented 3 years ago

AWS CodeBuild CI Report

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository