An Open Enclave port of the ONNX inference server with data encryption and attestation capabilities to enable confidential inference on Azure Confidential Computing.
While running this command:

    python3 -m confonnx.create_test_inputs --model model.onnx --out input.json

I get the following error:

    File "/home/ai/ben/final/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 370, in _create_inference_session
      sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
    onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from model.onnx failed:Protobuf parsing failed.