Have you had any luck with this issue? I'm looking for an example or some guidance on exporting a model for serving with SavedModelBuilder, for a text-based model (sentiment analysis in my case).
Hey Joel... unfortunately not. There have been a lot of recent changes in my life that I've been focusing on, but I'm going to pick this back up soon and try to figure it out. SavedModelBuilder, I believe, got me further than I was before, but I was hitting security issues when attempting to serve the model. I also tried using the Docker container, but that was a failure as well. I even reached out and offered to pay some people with more experience, and they too weren't able to get around a number of issues. So I'm not promising anything, but should I figure it out I will post here.
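Not a full answer, but for anyone landing here, below is a minimal sketch of the TF 1.x `SavedModelBuilder` export path. The tensor names (`article_ids`, `summary_ids`), the export directory, and the identity op standing in for the real textsum decode output are all placeholders/assumptions; you'd restore your trained decode graph and point the signature at its actual input and output tensors.

```python
import tensorflow as tf

export_dir = "/tmp/textsum_export/1"  # hypothetical path; Serving expects a version subdir

with tf.Session(graph=tf.Graph()) as sess:
    # In a real export you would rebuild the decode graph here and restore the
    # trained checkpoint with tf.train.Saver before exporting.
    article_ids = tf.placeholder(tf.int64, shape=[None, None], name="article_ids")
    summary_ids = tf.identity(article_ids, name="summary_ids")  # stand-in for the real decoder output

    sess.run(tf.global_variables_initializer())

    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    signature = tf.saved_model.signature_def_utils.build_signature_def(
        inputs={"articles": tf.saved_model.utils.build_tensor_info(article_ids)},
        outputs={"summaries": tf.saved_model.utils.build_tensor_info(summary_ids)},
        method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)

    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature
        })
    builder.save()
```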
Closing as this is a narrow use case for which we have no plans to add docs. If someone wants to contribute, please feel free to re-open.
I have not had much luck figuring out how to export the decode functionality of the Textsum model for serving, and would love it if someone could provide a Textsum decode example compatible with TensorFlow Serving, like the MNIST or Inception examples. I am interested in both batch processing and single-request processing.
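For the request side, here is a rough client sketch against the TF Serving gRPC Predict API, assuming the export above is served under the model name "textsum" on localhost:8500 with the default signature; the "articles"/"summaries" tensor names and the token ids are hypothetical. A single request carries one row; for batching, stack several encoded articles into the same input tensor.

```python
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "textsum"                  # assumed model name
request.model_spec.signature_name = "serving_default"

# One article (shape [1, seq_len]); for batch processing, send
# [batch_size, seq_len] instead.
encoded_article = [[3, 17, 42, 8]]  # hypothetical token ids
request.inputs["articles"].CopyFrom(
    tf.make_tensor_proto(encoded_article, dtype=tf.int64))

result = stub.Predict(request, timeout=10.0)
print(result.outputs["summaries"])
```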