franciscohanna92 opened 4 years ago
AWS has a service for this: https://aws.amazon.com/es/pytorch/. Locally we could use the PyTorch Docker image; in production, the AWS service.
Hmm, we could use SageMaker for that, but it's a bit expensive. We should only use it as a last resort.
What about dockerizing the model and deploying it using AWS ECS?
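To make the idea concrete, here is a minimal sketch of what the containerized inference service could look like, assuming a small Flask app and a hypothetical TorchScript checkpoint `model.pt` (neither is the project's actual setup). The image built around this would then be pushed to ECR and run as an ECS task.

```python
# Minimal sketch of an inference service that could be packaged into a Docker
# image and run on AWS ECS. The model path, input format, and use of Flask
# are assumptions for illustration, not the project's actual code.
import torch
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical TorchScript checkpoint baked into the image at build time.
model = torch.jit.load("model.pt", map_location="cpu")
model.eval()

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"inputs": [[...], ...]}.
    payload = request.get_json(force=True)
    inputs = torch.tensor(payload["inputs"], dtype=torch.float32)
    with torch.no_grad():
        outputs = model(inputs)
    return jsonify({"outputs": outputs.tolist()})

if __name__ == "__main__":
    # Inside the container this would typically sit behind gunicorn/uvicorn
    # rather than Flask's development server.
    app.run(host="0.0.0.0", port=8080)
```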
Lambdas will only last until we reach high traffic (they will become expensive). We should consider basing the architecture on AWS ECS to help keep costs under control.
The project is too heavy to run on AWS Lambda. It throws the following error when trying to deploy:
Torch is the heaviest dependency: almost 800 MB.