ONNC / onnc

Open Neural Network Compiler
https://onnc.ai
BSD 3-Clause "New" or "Revised" License

ONNX Zoo's basic MNIST model fails #178

Open robinvanemden opened 4 years ago

robinvanemden commented 4 years ago

ONNC produces incorrect results when compiling the MNIST model v1.3 from the ONNX model zoo.

Building the inference runtime as described in the backend guide and running it on test_data_set_1 results in:

> ./inference mnist.input onnc-runtime-service.weight

[-0.044384, 0.010354, 0.074052, 0.020479, -0.131909, 0.145801, -0.053591, -0.047789, 0.084736, -0.057945, ]

When compiling directly from source files I obtain:

gcc -fno-exceptions -I./include/ -I /onnc/onnc/include  \
./src/client-lib.c ./src/onnc-runtime-core.c  ./src/client-app.c  ./src/onnc-runtime-service.c \
/onnc/onnc/lib/Runtime/operator/randomnormal.c \

... etc, all C operator files ... 

/onnc/lib/Runtime/operator/gather.c  \
-o ./inference -lm

> ./inference mnist.input onnc-runtime-service.weight

[-2427.208984, -319.598022, 1552.176880, 111.066208, 1734.641357, -1012.644592, -1362.241211, 1107.857178, -280.093781, 117.853188, ]

... both are incorrect, as the result defined in output_0.pb reads:

[5041.8887, -3568.878, -187.82423, -1685.797, -1183.3232, -614.42926, 892.6643, -373.65845, -290.2623, -111.176216]

(A directly compiled Squeezenet works fine, which seems to indicate the problem is MNIST-specific?)
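
For reference, this is the kind of tolerance check I use to compare a produced vector against output_0.pb. It is a minimal sketch: the expected values are copied from above, and everything else (names, tolerance) is only illustrative:

#include <math.h>
#include <stdio.h>

/* Expected logits for test_data_set_1, copied from output_0.pb. */
static const float expected[10] = {
    5041.8887f, -3568.878f, -187.82423f, -1685.797f, -1183.3232f,
    -614.42926f, 892.6643f, -373.65845f, -290.2623f, -111.176216f
};

/* Compare a produced vector against the expected one, element by
   element, within a relative tolerance. Returns 1 on a match. */
static int outputs_match(const float actual[10], float rel_tol)
{
    for (int i = 0; i < 10; ++i) {
        float bound = rel_tol * fabsf(expected[i]);
        if (fabsf(actual[i] - expected[i]) > bound) {
            printf("mismatch at %d: got %f, expected %f\n",
                   i, actual[i], expected[i]);
            return 0;
        }
    }
    return 1;
}

int main(void)
{
    /* Output printed by the runtime built per the backend guide. */
    const float actual[10] = {
        -0.044384f, 0.010354f, 0.074052f, 0.020479f, -0.131909f,
        0.145801f, -0.053591f, -0.047789f, 0.084736f, -0.057945f
    };
    puts(outputs_match(actual, 0.01f) ? "match" : "no match");
    return 0;
}

Built with gcc compare.c -o compare -lm, this reports a mismatch at index 0 for both of the outputs above.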

ajaya1274 commented 3 years ago

- We have built the inference runtime as described in the backend guide.
- It works properly with the alexnet model and displays the expected output on test_data_set.
- But when we run ./inference mnist.input onnc-runtime-service.weight for MNIST inference, it gives a segmentation fault (core dump).

Kindly share the steps for creating the MNIST inference so that we can make the required changes and reproduce the issue.
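
A crash like this is often just an unchecked file load (for example, a wrong path to mnist.input or the weight file) rather than a bug in the operators themselves. As a generic sketch, not ONNC's actual client code, a defensive loader looks like this:

#include <stdio.h>
#include <stdlib.h>

/* Read a whole binary file into a malloc'd buffer, returning NULL
   on any failure instead of crashing later on a bad pointer. */
static void *load_file(const char *path, long *size_out)
{
    FILE *f = fopen(path, "rb");
    if (!f) { perror(path); return NULL; }
    if (fseek(f, 0, SEEK_END) != 0 || (*size_out = ftell(f)) < 0) {
        fclose(f);
        return NULL;
    }
    rewind(f);
    void *buf = malloc((size_t)*size_out);
    if (buf && fread(buf, 1, (size_t)*size_out, f) != (size_t)*size_out) {
        free(buf);
        buf = NULL;
    }
    fclose(f);
    return buf;
}

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s <input> <weight>\n", argv[0]);
        return 1;
    }
    long in_size = 0, w_size = 0;
    void *input  = load_file(argv[1], &in_size);
    void *weight = load_file(argv[2], &w_size);
    if (!input || !weight) return 1;  /* bad path or truncated file */
    printf("input: %ld bytes, weight: %ld bytes\n", in_size, w_size);
    free(input);
    free(weight);
    return 0;
}

Rebuilding the inference binary with the standard gcc flags -g -fsanitize=address should also report the exact faulting access instead of a bare core dump.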