This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Hi @quantum-fusion - here's an example notebook showing how you could use a model from the ONNX model zoo: https://github.com/bentoml/gallery/blob/master/onnx/resnet50/resnet50.ipynb
@parano Thank you for bringing this to my attention; the timing is perfect. I am investigating how to convert a CoreML model from Apple to ONNX and then host it, so I will look into using an ONNX model from the zoo and deploying it with BentoML. The one thing I am unsure about is the required pre-processing: my understanding was that BentoML requires shrinking the input picture (changing its size and aspect ratio) before it reaches the REST API service, and that is a limitation I wish BentoML could get around. Any idea about that one?
@quantum-fusion BentoML lets you define arbitrary pre-processing logic in Python, so "BentoML requires shrinking the aspect size ratio of the input picture to the REST API service" is not true. I'd suggest taking a look at the example notebooks in the BentoML gallery repo; a quick sketch of what that pre-processing can look like is below.
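To make that concrete, pre-processing is just ordinary Python that runs inside your service's API method before the model is called. Here is a minimal sketch for a ResNet-style model; the 224x224 size and the ImageNet mean/std constants are only an example, not something BentoML imposes:

```python
import numpy as np
from PIL import Image

def preprocess(image: Image.Image) -> np.ndarray:
    """Example pre-processing: resize, scale, normalize, and convert to NCHW."""
    image = image.convert("RGB").resize((224, 224))            # any resize policy you like
    arr = np.asarray(image, dtype=np.float32) / 255.0          # HWC in [0, 1]
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)   # ImageNet stats (example only)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32)
    arr = (arr - mean) / std                                    # broadcast over channels
    return arr.transpose(2, 0, 1)[np.newaxis, ...]              # 1 x 3 x 224 x 224
```

A function like this would be called from inside your own API handler, so the resize policy (or whether to resize at all) is entirely up to your code, not to BentoML.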
@parano What I want to do is host an ONNX model from the model zoo, or a custom ONNX model. I currently use TorchVision, but that is not an option for ONNX models. Do you know the method for hosting an ONNX model behind a REST API? I want to send an input picture (.jpg) and get a REST API response as a JSON message.
@quantum-fusion you can definitely use BentoML for that. Have you looked at the ONNX example notebook in the bentoml/gallery repository?
@parano Yes, I looked at the Jupyter notebook, but that is not the type of configuration I am trying to set up. I am more interested in the kind of model hosting that TorchServe provides (https://jtekds.com/announcing-torchserve-an-open-source-model-server-for-pytorch/). In short: I have a model zoo available from https://github.com/pytorch/serve/blob/master/docs/model_zoo.md, but those models are in .MAR format. Let's assume I can convert them to ONNX with a variety of tools (for example, by exporting from TorchVision as sketched below), or use a model straight from the ONNX model zoo (https://github.com/onnx/models). The problem I want to solve is hosting the ONNX model itself; let's say I want to host resnet50.onnx.
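For the conversion step, my plan would be to export from TorchVision with torch.onnx.export, roughly like the following; the 224x224 dummy input and the opset version are just my first guess:

```python
# Sketch: export a TorchVision ResNet50 to ONNX.
import torch
import torchvision

model = torchvision.models.resnet50(pretrained=True).eval()
dummy_input = torch.randn(1, 3, 224, 224)  # assumed input shape

torch.onnx.export(
    model,
    dummy_input,
    "resnet50.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=11,
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
)
```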
I need a REST API model server like TorchServe, except driven by BentoML (something like a "bentoml serve" command), and I want it to host resnet50.onnx.
@parano TorchServe does it like this: torchserve --start --model-store model_store --models densenet161=densenet161.mar. I am interested in the BentoML equivalent that lets me host ONNX models like ResNet50. The problem is that I currently have no way of converting ONNX to .MAR format.
I want to download and install the ResNet50 model from the ONNX model zoo: https://github.com/onnx/models/blob/master/vision/classification/resnet/model/resnet50-v2-7.onnx
I then want to load it and start the BentoML REST API service, but I do not see any example of how to do so.
The closest example I found is here (https://github.com/bentoml/BentoML/issues/963), but it is in Jupyter notebook form, contains a syntax error, and does not load the model.
I need an example of how to install an ONNX model from the model zoo and deploy it as a REST API on a MacBook.
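To be concrete, what I am imagining is roughly the sketch below, based on my reading of the gallery notebook's OnnxModelArtifact API; the service name, pre-processing constants, and handler details are my own guesses, so please correct whatever is wrong:

```python
# service.py -- rough sketch of serving resnet50-v2-7.onnx with BentoML
import numpy as np
from PIL import Image

import bentoml
from bentoml.adapters import ImageInput
from bentoml.frameworks.onnx import OnnxModelArtifact


@bentoml.env(infer_pip_packages=True)
@bentoml.artifacts([OnnxModelArtifact('model', backend='onnxruntime')])
class OnnxResnet50Service(bentoml.BentoService):

    @bentoml.api(input=ImageInput(), batch=True)
    def predict(self, images):
        # ImageInput should hand this handler a list of decoded image arrays,
        # one per uploaded .jpg; the 224x224 ResNet input size is my assumption.
        batch = []
        for img in images:
            pil = Image.fromarray(np.asarray(img).astype(np.uint8)).convert("RGB")
            arr = np.asarray(pil.resize((224, 224)), dtype=np.float32) / 255.0
            # (ImageNet mean/std normalization omitted here for brevity)
            batch.append(arr.transpose(2, 0, 1))
        batch = np.stack(batch).astype(np.float32)

        session = self.artifacts.model  # an onnxruntime InferenceSession, I believe
        input_name = session.get_inputs()[0].name
        logits = session.run(None, {input_name: batch})[0]
        # Return something JSON-serializable: top-1 class index per image
        return [int(np.argmax(row)) for row in logits]


if __name__ == "__main__":
    # Pack the downloaded zoo model and save the bundle; after that I would
    # expect `bentoml serve OnnxResnet50Service:latest` to start the REST API,
    # analogous to `torchserve --start ...`.
    svc = OnnxResnet50Service()
    svc.pack('model', 'resnet50-v2-7.onnx')
    svc.save()
```

If that is roughly right, the request would just be a POST of the .jpg to the service's prediction endpoint, with the classification result coming back as JSON, which is exactly what I am after.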
Please advise.