microsoft / onnx-server-openenclave

An Open Enclave port of the ONNX inference server with data encryption and attestation capabilities to enable confidential inference on Azure Confidential Computing.
MIT License

Create Inference Data #12

Open benirungu opened 2 years ago

benirungu commented 2 years ago

While running this command:

```
python3 -m confonnx.create_test_inputs --model model.onnx --out input.json
```

I get the error:

```
File "/home/ai/ben/final/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 370, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from model.onnx failed:Protobuf parsing failed.
```
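An `INVALID_PROTOBUF` error at load time usually means the bytes in `model.onnx` are not a valid ONNX protobuf at all, commonly because the file is a Git LFS pointer stub that was never fetched, an empty/truncated download, or a failed export. The issue itself does not say which applies here, but a minimal stdlib-only triage sketch (the helper name `diagnose_model_file` is illustrative, not part of confonnx or onnxruntime) could look like:

```python
import os

# Hypothetical helper: quick triage for "INVALID_PROTOBUF" before blaming
# onnxruntime. Checks two common on-disk causes of an unparseable model file.
def diagnose_model_file(path):
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        head = f.read(64)  # first bytes are enough to spot an LFS stub
    if head.startswith(b"version https://git-lfs"):
        # The repo stored the model via Git LFS and only the pointer
        # text file was checked out, not the real binary.
        return "git-lfs pointer stub; run `git lfs pull` to fetch the real model"
    if size == 0:
        return "empty file; the download or export likely failed"
    return "file has content; re-export the model or re-verify the download"
```

If the file turns out to be intact, the next step would be to confirm it is a real ONNX model (for example with `onnx.load(path)` from the `onnx` package) before passing it to `confonnx.create_test_inputs`.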