Closed: huangzhiyuan closed this issue 2 years ago
Hi @huangzhiyuan
We recommend running ONNX models through ACL using ArmNN's ExecuteNetwork.
Please take a look at https://github.com/ARM-software/armnn/tree/branches/armnn_22_02/tests/ExecuteNetwork
You can run a model like this:
LD_LIBRARY_PATH=.:/usr/lib/aarch64-linux-gnu//mali/fbdev ./ExecuteNetwork -m ../tflite_models/mobilebert_float_20191023.tflite -c CpuAcc -f tflite-binary -i input_ids -o end_logits
You will have to replace the model name and use the option -f onnx-binary instead.
Hope this helps.
@morgolock Thanks for your help. I ran into a runtime error:
./ExecuteNetwork -f onnx-binary -m test.onnx -c Cpuacc -i arg1,arg2 --infer-output-shape -s 1,128,50:1,50,3 -p
Fatal: The list of preferred devices contains invalid backend IDs: Cpuacc
Warning: No input files provided, input tensors will be filled with 0s.
Info: ArmNN v28.0.0
Can't load libOpenCL.so: libOpenCL.so: cannot open shared object file: No such file or directory
Can't load libGLES_mali.so: libGLES_mali.so: cannot open shared object file: No such file or directory
Can't load libmali.so: libmali.so: cannot open shared object file: No such file or directory
Couldn't find any OpenCL library.
Info: Initialization time: 1.12 ms.
Fatal: Armnn Error: Some backend IDs are invalid: Cpuacc
Info: Shutdown time: 0.01 ms.
How can I use ACL for acceleration in arm CPU?
Hi @huangzhiyuan
Please change Cpuacc to CpuAcc, with an uppercase A.
My mistake :(. Now the result is:
./ExecuteNetwork -f onnx-binary -m test.onnx -c CpuAcc -i arg1,arg2 --infer-output-shape -s 1,128,50:1,50,3 -p
Warning: No input files provided, input tensors will be filled with 0s.
Info: ArmNN v28.0.0
Can't load libOpenCL.so: libOpenCL.so: cannot open shared object file: No such file or directory
Can't load libGLES_mali.so: libGLES_mali.so: cannot open shared object file: No such file or directory
Can't load libmali.so: libmali.so: cannot open shared object file: No such file or directory
Couldn't find any OpenCL library.
Info: Initialization time: 1.08 ms.
Fatal: Armnn Error: Unsupported operation Slice for node 'Slice_4' at function LoadGraph [/devenv/armnn/src/armnnOnnxParser/OnnxParser.cpp:886]
Info: Shutdown time: 0.01 ms.
It seems some operators (e.g. Slice) are not yet supported by the ONNX parser during the LoadGraph phase.
Hi @huangzhiyuan
Unfortunately our onnx parser does not support the op Slice.
What model are you trying to run? Our tflite parser has support for more operators than the onnx parser. Is your model also available in tflite format?
The model I'm using is only available in ONNX format; there is no tflite version.
Hi @huangzhiyuan This is an example of how to add an operator to the ONNX parser in Arm NN: https://review.mlplatform.org/c/ml/armnn/+/6281
If you are interested and have time to add SLICE, you could create a Gerrit review account at mlplatform.org and make your contribution there. You can use your GitHub credentials when creating your account.
The process for contributing to ArmNN is outlined in this Contributor Guide.
Please let us know if you run into any issues.
Hi, I want to use ACL to test the performance and accuracy of ONNX and PT models. Is there a method or document for this? All I've found is the ACL Execution Provider in onnxruntime (https://onnxruntime.ai/docs/execution-providers/ACL-ExecutionProvider.html). Any suggestions?