aws / sagemaker-inference-toolkit

Serve machine learning models within a 🐳 Docker container using 🧠 Amazon SageMaker.
Apache License 2.0

handle max_request_size param set by SM platform #121

Closed rohithkrn closed 1 year ago

rohithkrn commented 1 year ago

Description of changes: SageMaker batch transform sets the SAGEMAKER_MAX_PAYLOAD_IN_MB env var, which is currently not handled and not passed to the model server. As a result, the model server falls back to its default max_request_size and ignores the value set by the user. This change fixes that.
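The behavior described above can be sketched roughly as follows. This is a hypothetical illustration, not the PR's actual diff: the `DEFAULT_MAX_PAYLOAD_MB` constant, the helper names, and the `config.properties` line format are assumptions for the example.

```python
import os

# Hypothetical sketch of the fix described in this PR: read the
# SAGEMAKER_MAX_PAYLOAD_IN_MB env var set by SageMaker batch transform
# and translate it into the model server's max_request_size (in bytes),
# instead of letting the server fall back to its own default.
# DEFAULT_MAX_PAYLOAD_MB and the config-line format below are assumed
# for illustration, not taken from the toolkit's source.

DEFAULT_MAX_PAYLOAD_MB = 6  # assumed fallback when the env var is absent


def max_request_size_bytes() -> int:
    """Return the max request size in bytes, honoring the env var if set."""
    payload_mb = int(os.environ.get("SAGEMAKER_MAX_PAYLOAD_IN_MB",
                                    DEFAULT_MAX_PAYLOAD_MB))
    return payload_mb * 1024 * 1024


def config_line() -> str:
    """Render the setting as a model-server config.properties-style line."""
    return f"max_request_size={max_request_size_bytes()}"
```

With the env var unset, `max_request_size_bytes()` returns the assumed default; once SageMaker exports, e.g., `SAGEMAKER_MAX_PAYLOAD_IN_MB=100`, the user-specified limit is propagated instead of being silently ignored.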

Testing done:

General

Tests

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

sagemaker-bot commented 1 year ago

AWS CodeBuild CI Report
