Not sure what you're asking. This is a testcase checking numerical accuracy.
When we run inference with a neural network, for example ResNet, we initialize the weights only once and then feed it different input data. Does onnx-mlir support a global variable to store the weight parameters?
Yes, the weights are stored in a global constant pool
Thank you. Is there any example?
Not sure what you want to see. Take any trained model, compile it with onnx-mlir, link it with a driver, and run it multiple times with different inputs. The weights are saved in the model binary, so you do not need to supply them in order to run inference. For an example, use MNIST or ResNet or something else that is available out there.
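For illustration, a minimal driver along these lines could look like the sketch below. It assumes the model was compiled with `onnx-mlir --EmitLib model.onnx` (producing a `model.so` that exports `run_main_graph`) and uses a hypothetical 1x3x224x224 float input; the model name, shape, and loop count are placeholders, not from this issue. The point is that the weights live inside the compiled library, so only the input tensor is created per call.

```cpp
// Hypothetical driver for a model compiled with: onnx-mlir --EmitLib model.onnx
// Link against the produced model.so and the onnx-mlir runtime headers.
// The compiled library embeds the trained weights as global constants,
// so only the input tensor is constructed for each inference call.
#include <OnnxMlirRuntime.h>
#include <cstdint>
#include <vector>

extern "C" OMTensorList *run_main_graph(OMTensorList *);

int main() {
  // Assumed ResNet-style input shape; adjust for your model.
  int64_t shape[] = {1, 3, 224, 224};
  std::vector<float> input(1 * 3 * 224 * 224, 0.0f);

  for (int i = 0; i < 10; ++i) {
    // Wrap the raw input buffer. No weight tensors are passed here;
    // they are already baked into model.so.
    OMTensor *x = omTensorCreate(input.data(), shape, 4, ONNX_TYPE_FLOAT);
    OMTensorList *inputs = omTensorListCreate(&x, 1);

    OMTensorList *outputs = run_main_graph(inputs);

    OMTensor *y = omTensorListGetOmtByIndex(outputs, 0);
    float *yData = static_cast<float *>(omTensorGetDataPtr(y));
    (void)yData; // consume the result as needed

    // Release per-call objects; see OnnxMlirRuntime.h for the exact
    // ownership rules of tensors held by a tensor list.
    omTensorListDestroy(outputs);
    omTensorListDestroy(inputs);
  }
  return 0;
}
```

Build it roughly as `c++ driver.cpp model.so -I <onnx-mlir include dir> -o driver` and run `./driver`; the weights are loaded with the shared library once, not copied per call.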
According to test/numerical/TestConv.cpp, we need to copy the weights each time we call the convolution function. Is there any way to set the weights once and avoid copying them on every computation?