aws-samples / amazon-textract-transformer-pipeline

Post-process Amazon Textract results with Hugging Face transformer models for document understanding

SageMaker Async Inference with scale-to-zero #13

Closed: athewsey closed this 2 years ago

athewsey commented 2 years ago

Issue #, if available: #8

Description of changes:

Additional updates:

Targeting this change against the support/1.x branch before porting to main.

Testing done:

Both the sync and async flows run successfully in the test environment. The async endpoint correctly scales down to 0 instances when there is no traffic, and restarts to serve incoming requests when required.
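
For reference, a minimal sketch of how scale-to-zero can be configured for a SageMaker async endpoint with Application Auto Scaling via boto3. The endpoint name, variant name, capacity limits, target value, and cooldowns below are illustrative placeholders, not necessarily the values used in this change:

```python
import boto3

# Hypothetical names/values for illustration only:
endpoint_name = "textract-postproc-async"
resource_id = f"endpoint/{endpoint_name}/variant/AllTraffic"

autoscaling = boto3.client("application-autoscaling")

# Allow the async endpoint variant to scale between 0 and 2 instances:
autoscaling.register_scalable_target(
    ServiceNamespace="sagemaker",
    ResourceId=resource_id,
    ScalableDimension="sagemaker:variant:DesiredInstanceCount",
    MinCapacity=0,  # scale-to-zero (supported for async inference endpoints)
    MaxCapacity=2,
)

# Target-tracking policy on the async invocation backlog per instance:
autoscaling.put_scaling_policy(
    PolicyName="AsyncBacklogTargetTracking",
    ServiceNamespace="sagemaker",
    ResourceId=resource_id,
    ScalableDimension="sagemaker:variant:DesiredInstanceCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 2.0,
        "CustomizedMetricSpecification": {
            "MetricName": "ApproximateBacklogSizePerInstance",
            "Namespace": "AWS/SageMaker",
            "Dimensions": [{"Name": "EndpointName", "Value": endpoint_name}],
            "Statistic": "Average",
        },
        "ScaleInCooldown": 300,
        "ScaleOutCooldown": 60,
    },
)
```

Requests submitted while the endpoint is at 0 instances (e.g. via `sagemaker-runtime` `invoke_endpoint_async`) queue in the internal backlog and drive scale-out; if scale-out from zero proves slow, the AWS documentation also describes an additional step-scaling policy on the `HasBacklogWithoutCapacity` metric.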


By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.