awslabs / mlmax

Example templates for the delivery of custom ML solutions to production so you can get started quickly without having to make too many design choices.
https://mlmax.readthedocs.io/en/latest/
Apache License 2.0

Running inference_pipeline_run.py failed #92

Closed yinsong1986 closed 2 years ago

yinsong1986 commented 3 years ago

🐛 Bug report

Describe the bug

When running inference_pipeline_run.py, it might fail with an error like:

```
Traceback (most recent call last):
  File "inference_pipeline_run.py", line 185, in <module>
    example_run_inference_pipeline(workflow_arn, region)
  File "inference_pipeline_run.py", line 100, in example_run_inference_pipeline
    proc_model_s3, model_s3 = get_latest_models()
  File "inference_pipeline_run.py", line 39, in get_latest_models
    processing_job_name = response["ProcessingJobSummaries"][0]["ProcessingJobName"]
IndexError: list index out of range
```

To reproduce

Not easy to reproduce; it depends on the SageMaker API. The error occurs when the list of processing job summaries returned by the API is empty, so indexing `[0]` raises an `IndexError`.

Expected behavior

When no matching processing jobs exist, the script should fail with a clear error message (e.g. telling the user to run the training pipeline first) rather than crashing with an `IndexError`.
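A minimal sketch of a possible fix for the lookup in `get_latest_models()`: guard against an empty `ProcessingJobSummaries` list before indexing. The helper name below is hypothetical (not from the repo); it operates on the response dict returned by boto3's `sagemaker.list_processing_jobs` call.

```python
def latest_processing_job_name(response):
    """Return the newest processing job name from a SageMaker
    list_processing_jobs response, or None if no jobs exist.

    Hypothetical helper: instead of response["ProcessingJobSummaries"][0],
    which raises IndexError on an empty list, check the list first.
    """
    summaries = response.get("ProcessingJobSummaries", [])
    if not summaries:
        return None
    return summaries[0]["ProcessingJobName"]


# Sketch of how get_latest_models() could use it (assumes a boto3
# SageMaker client; SortBy/SortOrder are real list_processing_jobs params):
#   response = client.list_processing_jobs(
#       SortBy="CreationTime", SortOrder="Descending", MaxResults=1
#   )
#   name = latest_processing_job_name(response)
#   if name is None:
#       raise RuntimeError("No processing jobs found; run training first.")
```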

System information

github-actions[bot] commented 2 years ago

This issue is stale because it has been open for 60 days with no activity. Please update or respond to this comment if you're still interested in working on this.