romain-intel opened 4 years ago
```python
from metaflow import WebServiceSpec
from metaflow import endpoint


class MyWebService(WebServiceSpec):

    @endpoint
    def show_data(self, request):
        return {}
```
I did not find these classes in the master branch. Looking forward to seeing them.
@wuqunfei Yes, you are correct, we haven’t released WebServiceSpec yet.
Thanks, @savingoyal, it is a really good idea to expose an endpoint decorator. I am using SageMaker to host endpoints, but it is not as easy as your annotation.
Hi! Saw the talk on this topic yesterday and am really excited for this feature to be released in the future.
Is there a channel where development on this is being discussed? Would love to participate and help out.
Yes, @justinttl, this issue tracks the development.
This is a good medium to communicate your ideas. We are currently working internally on a redesign of the ideas expressed in the talk and, given sufficient interest, we will look at releasing this to the community as well. The talk (and associated slides) gives a pretty good idea of where we are at right now. Comments/thoughts are more than welcome here!
Kubeflow uses seldon.io for deploying models; it may be worth looking into, or perhaps a pluggable layer with a Seldon backend.
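For illustration, a pluggable Seldon backend could be as thin as Seldon Core's Python model wrapper loading the trained artifact through the Metaflow client. The sketch below is only a rough idea: the flow name `TrainingFlow` and the artifact name `model` are placeholders, and it assumes the artifact exposes a scikit-learn-style `predict`.

```python
# Rough sketch of a Seldon Core Python wrapper serving a Metaflow artifact.
# `TrainingFlow` and `model` are placeholder names, not Metaflow built-ins.
from metaflow import Flow


class MetaflowModel:
    def __init__(self):
        # Pull the model artifact from the latest successful run of the flow.
        run = Flow("TrainingFlow").latest_successful_run
        self.model = run.data.model

    def predict(self, X, features_names=None):
        # Seldon Core invokes predict() with a batch of feature rows.
        return self.model.predict(X)
```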
Hi @romain-intel , any information on the progress of this feature? Thanks!
How about we package the model artifact and deploy it using SageMaker in the end step?
It might be worth exploring using the Amazon SageMaker Training Toolkit as a way to create a managed docker container in Amazon SageMaker for a particular step.
@sl-victormazzeo We are looking into publishing recipes on how to deploy Metaflow artifacts within a micro-service, independent of this effort. Today, you can access your artifacts via the Metaflow client within hosting services like Seldon, AWS Lambda, etc.
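To make that concrete, here is a minimal sketch of what the AWS Lambda route could look like, assuming a flow named `TrainingFlow` that persists a scikit-learn-style artifact called `model`, and a Metaflow configuration (S3 datastore, credentials) that is reachable from the Lambda environment; none of these names come from Metaflow itself.

```python
# Sketch of an AWS Lambda handler serving predictions from a Metaflow artifact.
# Flow and artifact names are placeholders; the Metaflow S3 datastore must be
# accessible from the Lambda execution role.
import json

from metaflow import Flow

# Load the artifact once per container so warm invocations reuse it.
_model = Flow("TrainingFlow").latest_successful_run.data.model


def handler(event, context):
    features = json.loads(event["body"])["features"]
    # Assumes a scikit-learn-style model returning a numeric prediction.
    prediction = _model.predict([features])[0]
    return {
        "statusCode": 200,
        "body": json.dumps({"prediction": float(prediction)}),
    }
```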
@agusgun Yes, you can very simply use boto3 to package your model artifact and deploy it to SageMaker in the end step.
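For anyone who wants to try that before anything official ships, a very rough sketch of such an end step is below. Nothing in it is Metaflow functionality: the container image URI and execution role ARN are placeholders you have to supply, and it assumes the model has already been packaged as a `model.tar.gz` and uploaded to S3 earlier in the flow.

```python
# Rough sketch: a flow whose `end` step registers the trained model with
# SageMaker via boto3 and stands up a hosted endpoint. Image URI and role ARN
# are placeholders; self.model_data_url is assumed to point at a model.tar.gz
# that was uploaded to S3 earlier in the flow.
import boto3
from metaflow import FlowSpec, step, current


class DeployFlow(FlowSpec):

    @step
    def start(self):
        # ... training happens here; the packaged artifact lands in S3 ...
        self.model_data_url = "s3://my-bucket/artifacts/model.tar.gz"  # placeholder
        self.next(self.end)

    @step
    def end(self):
        sm = boto3.client("sagemaker")
        name = f"myflow-{current.run_id}"

        # Register the model artifact and its serving container with SageMaker.
        sm.create_model(
            ModelName=name,
            PrimaryContainer={
                "Image": "<inference-container-image-uri>",  # placeholder
                "ModelDataUrl": self.model_data_url,
            },
            ExecutionRoleArn="<sagemaker-execution-role-arn>",  # placeholder
        )

        # Describe how the endpoint should be provisioned.
        sm.create_endpoint_config(
            EndpointConfigName=f"{name}-config",
            ProductionVariants=[{
                "VariantName": "AllTraffic",
                "ModelName": name,
                "InstanceType": "ml.m5.large",
                "InitialInstanceCount": 1,
            }],
        )

        # Create the hosted endpoint (provisioning happens asynchronously).
        sm.create_endpoint(
            EndpointName=f"{name}-endpoint",
            EndpointConfigName=f"{name}-config",
        )


if __name__ == "__main__":
    DeployFlow()
```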
@brightsparc We do have a pending PR that tries to achieve something very similar.
Once Metaflow has been used to train a model, it produces artifacts that are typically persisted (for example in S3). A natural extension of this is to provide an easy mechanism to deploy web services that would take these artifacts and serve them in some way so that they can be consumed by downstream applications.