aws / sagemaker-inference-toolkit

Serve machine learning models within a 🐳 Docker container using 🧠 Amazon SageMaker.
Apache License 2.0

feature: support requirements.txt #12

Closed: ChoiByungWook, 4 years ago

ChoiByungWook commented 4 years ago

Issue #, if available: https://github.com/aws/sagemaker-inference-toolkit/issues/9 https://github.com/aws/sagemaker-python-sdk/issues/957 https://github.com/aws/sagemaker-python-sdk/issues/664

Description of changes: Added code to support installing requirements.txt.

This assumes that requirements.txt will be located in /opt/ml/model/code/, as defined in environments.py and in the tensorflow-serving-container.
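A minimal sketch of what such an installation step could look like, assuming the path mentioned above; the function names here are illustrative, not the toolkit's actual API:

```python
import subprocess
import sys

# Path assumed by this PR (mirrors environments.py and the
# tensorflow-serving-container, per the description above).
REQUIREMENTS_PATH = "/opt/ml/model/code/requirements.txt"


def pip_install_command(requirements_path=REQUIREMENTS_PATH):
    """Build the pip command used to install model dependencies."""
    # Invoking pip via the current interpreter avoids relying on a
    # `pip` executable being on PATH inside the container.
    return [sys.executable, "-m", "pip", "install", "-r", requirements_path]


def install_requirements(requirements_path=REQUIREMENTS_PATH):
    """Install the dependencies listed in requirements.txt."""
    subprocess.check_call(pip_install_command(requirements_path))
```

Splitting command construction from execution also makes the subprocess call easy to unit-test without actually running pip.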

The unit tests are very bare. I was able to patch subprocess.check_call; however, when I assert on it with assert_called_once_with, the test fails, claiming the mock was never called, even though the side effect wouldn't have happened if that were true.

msg = "Expected 'check_call' to be called once. Called 0 times."
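One common cause of this exact "Called 0 times" failure (not necessarily what is happening here) is a patch-target mismatch: `mock.patch` must replace the name the code under test actually looks up. If a module does `from subprocess import check_call`, patching `"subprocess.check_call"` swaps the attribute on the `subprocess` module but leaves the already-imported local binding pointing at the real function. A minimal reproduction:

```python
import subprocess
import sys
from subprocess import check_call as local_check_call
from unittest import mock


def uses_module_attr():
    # Looks the function up on the subprocess module at call time,
    # so patching "subprocess.check_call" intercepts it.
    subprocess.check_call([sys.executable, "-c", "pass"])


def uses_local_name():
    # Uses the name bound at import time; patching
    # "subprocess.check_call" does NOT replace this binding.
    local_check_call([sys.executable, "-c", "pass"])


with mock.patch("subprocess.check_call") as mocked:
    uses_module_attr()
    mocked.assert_called_once()  # intercepted as expected

with mock.patch("subprocess.check_call") as mocked:
    uses_local_name()  # the real check_call runs; the mock sees nothing
    missed_calls = mocked.call_count  # 0, reproducing "Called 0 times"
```

The fix in that scenario is to patch the name where it is used (e.g. `mock.patch("my_module.check_call")`) rather than where it is defined.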

Testing: tox

```
flake8: commands succeeded
twine: commands succeeded
py27: commands succeeded
py36: commands succeeded
congratulations :)
```

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.