Open robinvanemden opened 4 years ago
We have built the inference runtime as described in the backend guide. It works properly with the AlexNet model and displays the expected output on the test data set. But when we run `./inference mnist.input onnc-runtime-service.weight` (to run MNIST inference), it gives a segmentation fault (core dump). Kindly share the steps for creating the MNIST inference so that we can make the required changes and reproduce the issue.
ONNC produces incorrect output when compiling the MNIST model v1.3 from the ONNX model repo.
Building the inference runtime as described in the backend guide and running it on test_data_set_1 results in:
When compiling directly from the source files, I obtain:
... both are incorrect, as the result defined in output_0.pb reads:
(A directly compiled SqueezeNet works fine, which seems to indicate this is MNIST-related?)