saimidu opened this pull request 2 years ago
Powered by github-codebuild-logs, available on the AWS Serverless Application Repository
Issue #, if available:
Description of changes: The `model_name` parameter of a non-MME inference request, in the presence of a user's inference script, is treated as `None` by the Python service for SageMaker inference. This means that the `context` passed to the inference handler does not have a valid `model_name` attribute. Therefore, GRPC-based inference requests for a model cannot be configured programmatically and must instead always be hardcoded in the user's inference script. This requirement makes it harder to compile a model, or to use it with SageMaker Inference Recommender to produce a Neo-compilation-compatible model.
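For illustration, here is a minimal sketch of the limitation described above. The `Context` class and `handler` function are hypothetical stand-ins, not code from this repository:

```python
# Hypothetical stand-in for the context object the Python service passes
# to a user's inference handler.
class Context:
    def __init__(self, model_name=None):
        self.model_name = model_name


def handler(data, context):
    # Because context.model_name arrives as None for non-MME requests,
    # the model name has to be hardcoded in the user's inference script:
    model_name = context.model_name or "my_hardcoded_model"
    return {"model": model_name, "data": data}
```

With this workaround, changing the target model means editing the inference script rather than configuring the request.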
This PR makes a change to `tfs_utils.py` to use the `default_model_name`, obtained from the `TFS_DEFAULT_MODEL_NAME` environment variable, as the final option in case the input `model_name` is `None` and the request attributes do not contain the model name.

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.