aws / sagemaker-inference-toolkit

Serve machine learning models within a 🐳 Docker container using 🧠 Amazon SageMaker.
Apache License 2.0

fix: Fixing issue #82 #83

Closed amaharek closed 3 years ago

amaharek commented 3 years ago

Issue #, if available: Fixing issue https://github.com/aws/sagemaker-inference-toolkit/issues/82

Description of changes: The change uses the `` option, as suggested in the documentation.

Testing done: I have built a custom container using the patched version, and the reported CPU count matches the CPU count actually available to the container.
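For context on what the test above is checking: inside a Docker container, Python's `multiprocessing.cpu_count()` reports the host's CPU count, not the subset of CPUs the container is allowed to use. A minimal sketch of the distinction (an illustration of the underlying issue, not the exact patch in this PR) is:

```python
import multiprocessing
import os

# Host-wide CPU count: ignores any cgroup/cpuset limits on the container.
host_cpus = multiprocessing.cpu_count()

try:
    # Linux-only: size of this process's CPU affinity mask, which does
    # honour cpuset restrictions applied to the container.
    usable_cpus = len(os.sched_getaffinity(0))
except AttributeError:
    # sched_getaffinity is unavailable on some platforms (e.g. macOS).
    usable_cpus = host_cpus

print(f"host CPUs: {host_cpus}, usable CPUs: {usable_cpus}")
```

Sizing model-server workers from the host count can oversubscribe a CPU-limited container, so a count that respects the container's limits is the safer basis for worker configuration.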

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

sagemaker-bot commented 3 years ago

AWS CodeBuild CI Report

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

vdantu commented 3 years ago

Curious to know why this PR is still pending.

henryhu666 commented 2 years ago

Hi, we are facing the same issue and would like to use this fix in SageMaker. Is there a plan to cut a release anytime soon?