onnx / tutorials

Tutorials for creating and using ONNX models

ONNXRuntime Server in Windows #200

Open anirudha16101 opened 4 years ago

anirudha16101 commented 4 years ago

Hi, can someone please help me with how to build and use the ONNX Runtime Server on Windows with gRPC and HTTP support?

I have made a C++ API which takes an image as input and uses an ONNX model for inference. Now I want to create a server for model inference. Any help would be appreciated. Thanks!!
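For context, a minimal sketch of that kind of pipeline using ONNX Runtime's Python API (the C++ API follows the same load-session-and-run pattern); the model path, input handling, and dummy NCHW image below are placeholders rather than the code from this issue:

```python
# Minimal local-inference sketch with ONNX Runtime's Python API.
# "model.onnx" and the 1x3x224x224 float32 input are placeholders.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx")      # load the exported ONNX model
input_name = sess.get_inputs()[0].name         # assume a single image input

image = np.random.rand(1, 3, 224, 224).astype(np.float32)  # stand-in for a preprocessed image

outputs = sess.run(None, {input_name: image})  # None = return all model outputs
print([o.shape for o in outputs])
```

The question in this issue is how to expose exactly this kind of inference behind a gRPC/HTTP server instead of calling it in-process.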

askhade commented 4 years ago

@anirudha16101: Please check the following documentation around this:

Usage: https://github.com/microsoft/onnxruntime/blob/6d4f2f5bf92f86a8bbe52b6ecfed4bd1eb5e4ed1/docs/ONNX_Runtime_Server_Usage.md

Tutorial: https://github.com/onnx/tutorials/blob/master/tutorials/OnnxRuntimeServerSSDModel.ipynb
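For a rough idea of what the linked usage doc and notebook cover, the client side boils down to wrapping the input tensor in a PredictRequest protobuf and sending the serialized bytes to the server. This is a hedged sketch, not a drop-in script: `predict_pb2` is assumed to be a module generated from the server's predict.proto (as the notebook does), and the input name "image" is a placeholder for your model's actual input name.

```python
# Hedged sketch of building an ONNX Runtime Server prediction payload.
# Assumes predict_pb2 was generated from the server's predict.proto;
# "image" is a placeholder for the model's real input name.
import numpy as np
from onnx import numpy_helper
import predict_pb2  # generated module (assumption, see lead-in)

image = np.random.rand(1, 3, 224, 224).astype(np.float32)   # preprocessed image
tensor = numpy_helper.from_array(image)                      # numpy -> onnx.TensorProto

request = predict_pb2.PredictRequest()
request.inputs["image"].CopyFrom(tensor)                     # inputs: map<string, TensorProto>
payload = request.SerializeToString()                        # bytes for the HTTP or gRPC call
```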

anirudha16101 commented 4 years ago

@askhade Thank you, I saw the document, but the server is only available on Linux. Is there any way we can build it on Windows? I am not sure whether it is available on Windows or not. Thanks for the help!!

hariharans29 commented 4 years ago

Hi @anirudha16101,

The tutorial referred to above mostly demonstrates the usage of the ONNX Runtime server once you have it running. The ONNX Runtime server runs only on Linux at this point, so the server itself would have to run on a Linux machine. You can run the rest of the notebook on a Windows machine (changing the endpoints appropriately).
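Concretely, that means the notebook's client code can run on Windows essentially unchanged except for the endpoint: point it at the Linux machine hosting the server. A hedged sketch is below; the hostname, port, model name, and version are placeholders, and the URL pattern and headers follow the usage doc linked earlier.

```python
# Hedged sketch of a Windows-side client calling an ONNX Runtime Server
# running on a Linux host. Hostname, port, model name, and version are
# placeholders; predict_pb2 is the generated module mentioned above.
import numpy as np
import requests
from onnx import numpy_helper
import predict_pb2

# Build the request body (same steps as the earlier sketch).
image = np.random.rand(1, 3, 224, 224).astype(np.float32)
request = predict_pb2.PredictRequest()
request.inputs["image"].CopyFrom(numpy_helper.from_array(image))

# Only this endpoint changes when the client moves to a Windows machine.
url = "http://my-linux-server:8001/v1/models/mymodel/versions/1:predict"
headers = {"Content-Type": "application/octet-stream",
           "Accept": "application/octet-stream"}

resp = requests.post(url, headers=headers, data=request.SerializeToString())

response = predict_pb2.PredictResponse()
response.ParseFromString(resp.content)
print(list(response.outputs.keys()))   # names of the model's output tensors
```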