aws / sagemaker-inference-toolkit

Serve machine learning models within a 🐳 Docker container using 🧠 Amazon SageMaker.
Apache License 2.0

fix: fix loading user custom script #112

Open SuperBo opened 1 year ago

SuperBo commented 1 year ago

Issue #86: the problem is described in detail in https://github.com/aws/sagemaker-inference-toolkit/issues/86.

Description of changes:

Changing the PYTHONPATH environment variable after the Python program has already started does not affect the import paths that Python uses to import libraries and modules. To change the import paths at runtime, we need to modify the `sys.path` variable instead.
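A minimal sketch of the difference (the path and `my_module` are placeholders, not part of this PR):

```python
import os
import sys

# Setting PYTHONPATH in a running interpreter has no effect on import resolution;
# the variable is only read when the interpreter starts.
os.environ["PYTHONPATH"] = "/opt/ml/model/code"
# import my_module  # would still fail if the directory is not already on sys.path

# Modifying sys.path takes effect immediately for subsequent imports.
sys.path.insert(0, "/opt/ml/model/code")
# import my_module  # now resolvable if my_module.py lives in /opt/ml/model/code
```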

Testing done:

Merge Checklist

Put an x in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your pull request.

General

Tests

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

Otje89 commented 1 year ago

What is the status of this merge request? I'm experiencing the same issue and found that it was indeed caused by the same line of code. It would be great to have this updated.

Alex-Wenner-FHR commented 5 months ago

I am still experiencing this issue in 2024. Has there been any fix? This looks like a simple solution given this is indeed the issue.

ViktorMalesevic commented 3 months ago

Same issue here. When adding sagemaker-inference to my own container, this line: https://github.com/aws/sagemaker-inference-toolkit/blob/f79853d4e307a983ec5d76980be5ed6a9feb8c0e/src/sagemaker_inference/default_handler_service.py#L52 forces every module the entrypoint uses to be copied into `/opt/ml/model/code/`, with the entrypoint itself placed at the root of `/opt/ml/model/code/`. This requirement is not documented in https://github.com/aws/sagemaker-inference-toolkit/tree/master?tab=readme-ov-file#implementation-steps
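Until a fix is merged, one possible workaround is for the entrypoint itself to put its own directory on `sys.path` before importing sibling modules. A hedged sketch (file name and helper module are hypothetical, not part of the toolkit):

```python
# inference.py — hypothetical entrypoint placed at /opt/ml/model/code/inference.py
import os
import sys

# The toolkit only updates the PYTHONPATH environment variable after startup,
# so make sibling modules importable by adding this file's directory to sys.path.
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

import my_utils  # noqa: E402  (hypothetical helper module next to inference.py)
```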