Unfortunately, we only use SageMaker for training (in our case, language models from scratch; see the blog post and the example script for the container and marketplace algorithm). I guess you are more interested in the inference side of things? We don't use SageMaker there, so we probably cannot help much with that.
Thank you for those pointers, I will check them out! Yes, in my case it is mainly inference, but as I am totally new to AWS / SageMaker, everything helps.
OK, a short update: to get inference to work with SageMaker, the "standard" approach would be to serve the model with TorchServe. Has anyone ever tried that? TBH, I do not really understand why TorchServe does things the way it does, or what the best way would be to fit FARM inference into it. A rough sketch of what I imagine a custom handler would look like is below.
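For reference, here is a minimal, untested sketch of a custom TorchServe handler that simply delegates to FARM's `Inferencer`. The handler class name and the request payload format are assumptions on my part; only `Inferencer.load` and `inference_from_dicts` come from FARM itself.

```python
import json

from ts.torch_handler.base_handler import BaseHandler
from farm.infer import Inferencer


class FarmHandler(BaseHandler):
    """Sketch of a TorchServe handler that wraps a FARM Inferencer."""

    def initialize(self, context):
        # TorchServe unpacks the .mar archive into model_dir.
        model_dir = context.system_properties.get("model_dir")
        self.model = Inferencer.load(model_dir)
        self.initialized = True

    def handle(self, data, context):
        # data is a list of request rows; the payload sits under
        # "body" or "data" and may arrive as raw bytes.
        dicts = []
        for row in data:
            payload = row.get("body") or row.get("data")
            if isinstance(payload, (bytes, bytearray)):
                payload = json.loads(payload)
            dicts.append(payload)
        # The expected dict format depends on the FARM task,
        # e.g. [{"text": "..."}] for text classification (assumption).
        return self.model.inference_from_dicts(dicts=dicts)
```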
Hi @johann-petrak, it is definitely possible to use FARM for serving predictions on a SageMaker endpoint by wrapping it in an API like Flask and containerizing all of the code responsible for inference.
The basic requirement is that the API responds to requests against /invocations and /ping so that it is compatible with SageMaker's serving framework. Some documentation that could help you can be found at this link.
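For illustration, a minimal Flask app covering those two routes might look like the sketch below. The model directory `/opt/ml/model` and port 8080 follow SageMaker's container conventions; the `{"inputs": [...]}` request format is an assumption for the example.

```python
from flask import Flask, Response, jsonify, request
from farm.infer import Inferencer

app = Flask(__name__)

# SageMaker extracts the model artifact to /opt/ml/model in the container.
model = Inferencer.load("/opt/ml/model")


@app.route("/ping", methods=["GET"])
def ping():
    # SageMaker calls /ping to check that the container is healthy.
    return Response(status=200)


@app.route("/invocations", methods=["POST"])
def invocations():
    # SageMaker forwards inference requests to /invocations.
    payload = request.get_json(force=True)
    # FARM's Inferencer expects a list of dicts; the exact keys
    # (e.g. {"text": "..."}) depend on the task the model was trained for.
    predictions = model.inference_from_dicts(dicts=payload["inputs"])
    return jsonify({"predictions": predictions})


if __name__ == "__main__":
    # SageMaker expects the serving container to listen on port 8080.
    app.run(host="0.0.0.0", port=8080)
```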
In particular, has anyone deployed a FARM model on AWS SageMaker?