ARM-software / armnn

Arm NN ML Software. The code here is a read-only mirror of https://review.mlplatform.org/admin/repos/ml/armnn
https://developer.arm.com/products/processors/machine-learning/arm-nn
MIT License

Error compiling `OnnxMnist-Armnn.cpp` #774

Closed qianfei11 closed 3 months ago

qianfei11 commented 3 months ago

I tried to compile OnnxMnist-Armnn.cpp to try out a simple model inference, but ran into this error:

arm-user@00b8c6c736bc:~/source/armnn/tests/OnnxMnist-Armnn$ aarch64-linux-gnu-g++ OnnxMnist-Armnn.cpp -o OnnxMnist-Armnn -L/home/arm-user/build/armnn/aarch64_build -I/home/arm-user/build/armnn/aarch64_build/include -I/home/arm-user/source/armnn/profiling -I/home/arm-user/source/armnn/third-party -I/home/arm-user/source/armnn/src/armnnUtils -larmnn -larmnnOnnxParser -larmnnUtils -lprotobuf -lpthread
In file included from ../InferenceTest.hpp:7,
                 from OnnxMnist-Armnn.cpp:5:
../InferenceModel.hpp: In instantiation of ‘static armnn::INetworkPtr CreateNetworkImpl<IParser>::Create(const Params&, std::vector<std::pair<int, armnn::TensorInfo> >&, std::vector<std::pair<int, armnn::TensorInfo> >&) [with IParser = armnnOnnxParser::IOnnxParser; armnn::INetworkPtr = std::unique_ptr<armnn::INetwork, void (*)(armnn::INetwork*)>; CreateNetworkImpl<IParser>::Params = InferenceModelInternal::Params]’:
../InferenceModel.hpp:451:76:   required from ‘InferenceModel<IParser, TDataType>::InferenceModel(const Params&, bool, const string&, const std::shared_ptr<armnn::IRuntime>&) [with IParser = armnnOnnxParser::IOnnxParser; TDataType = float; InferenceModel<IParser, TDataType>::Params = InferenceModelInternal::Params; std::string = std::__cxx11::basic_string<char>]’
/usr/aarch64-linux-gnu/include/c++/9/bits/unique_ptr.h:857:30:   required from ‘typename std::_MakeUniq<_Tp>::__single_object std::make_unique(_Args&& ...) [with _Tp = InferenceModel<armnnOnnxParser::IOnnxParser, float>; _Args = {InferenceModelInternal::Params&, const bool&, const std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >&}; typename std::_MakeUniq<_Tp>::__single_object = std::unique_ptr<InferenceModel<armnnOnnxParser::IOnnxParser, float>, std::default_delete<InferenceModel<armnnOnnxParser::IOnnxParser, float> > >]’
../InferenceTest.inl:409:60:   required from ‘int armnn::test::ClassifierInferenceTestMain(int, char**, const char*, bool, const char*, const char*, const std::vector<unsigned int>&, TConstructDatabaseCallable, const armnn::TensorShape*) [with TDatabase = MnistDatabase; TParser = armnnOnnxParser::IOnnxParser; TConstructDatabaseCallable = main(int, char**)::<lambda(const char*, const ModelType&)>]’
OnnxMnist-Armnn.cpp:28:39:   required from here
../InferenceModel.hpp:157:47: error: no matching function for call to ‘armnnOnnxParser::IOnnxParser::CreateNetworkFromBinaryFile(const char*, std::map<std::__cxx11::basic_string<char>, armnn::TensorShape>&, std::vector<std::__cxx11::basic_string<char> >&)’
  157 |             network = (params.m_IsModelBinary ?
      |                       ~~~~~~~~~~~~~~~~~~~~~~~~^
  158 |                 parser->CreateNetworkFromBinaryFile(modelPath.c_str(), inputShapes, requestedOutputs) :
      |                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
  159 |                 parser->CreateNetworkFromTextFile(modelPath.c_str(), inputShapes, requestedOutputs));
      |                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In file included from OnnxMnist-Armnn.cpp:7:
/home/arm-user/build/armnn/aarch64_build/include/armnnOnnxParser/IOnnxParser.hpp:38:24: note: candidate: ‘armnn::INetworkPtr armnnOnnxParser::IOnnxParser::CreateNetworkFromBinaryFile(const char*)’
   38 |     armnn::INetworkPtr CreateNetworkFromBinaryFile(const char* graphFile);
      |                        ^~~~~~~~~~~~~~~~~~~~~~~~~~~
/home/arm-user/build/armnn/aarch64_build/include/armnnOnnxParser/IOnnxParser.hpp:38:24: note:   candidate expects 1 argument, 3 provided
/home/arm-user/build/armnn/aarch64_build/include/armnnOnnxParser/IOnnxParser.hpp:47:24: note: candidate: ‘armnn::INetworkPtr armnnOnnxParser::IOnnxParser::CreateNetworkFromBinaryFile(const char*, const std::map<std::__cxx11::basic_string<char>, armnn::TensorShape>&)’
   47 |     armnn::INetworkPtr CreateNetworkFromBinaryFile(const char* graphFile,
      |                        ^~~~~~~~~~~~~~~~~~~~~~~~~~~
/home/arm-user/build/armnn/aarch64_build/include/armnnOnnxParser/IOnnxParser.hpp:47:24: note:   candidate expects 2 arguments, 3 provided
In file included from ../InferenceTest.hpp:7,
                 from OnnxMnist-Armnn.cpp:5:
../InferenceModel.hpp:157:47: error: no matching function for call to ‘armnnOnnxParser::IOnnxParser::CreateNetworkFromTextFile(const char*, std::map<std::__cxx11::basic_string<char>, armnn::TensorShape>&, std::vector<std::__cxx11::basic_string<char> >&)’
  157 |             network = (params.m_IsModelBinary ?
      |                       ~~~~~~~~~~~~~~~~~~~~~~~~^
  158 |                 parser->CreateNetworkFromBinaryFile(modelPath.c_str(), inputShapes, requestedOutputs) :
      |                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
  159 |                 parser->CreateNetworkFromTextFile(modelPath.c_str(), inputShapes, requestedOutputs));
      |                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In file included from OnnxMnist-Armnn.cpp:7:
/home/arm-user/build/armnn/aarch64_build/include/armnnOnnxParser/IOnnxParser.hpp:41:24: note: candidate: ‘armnn::INetworkPtr armnnOnnxParser::IOnnxParser::CreateNetworkFromTextFile(const char*)’
   41 |     armnn::INetworkPtr CreateNetworkFromTextFile(const char* graphFile);
      |                        ^~~~~~~~~~~~~~~~~~~~~~~~~
/home/arm-user/build/armnn/aarch64_build/include/armnnOnnxParser/IOnnxParser.hpp:41:24: note:   candidate expects 1 argument, 3 provided
/home/arm-user/build/armnn/aarch64_build/include/armnnOnnxParser/IOnnxParser.hpp:51:24: note: candidate: ‘armnn::INetworkPtr armnnOnnxParser::IOnnxParser::CreateNetworkFromTextFile(const char*, const std::map<std::__cxx11::basic_string<char>, armnn::TensorShape>&)’
   51 |     armnn::INetworkPtr CreateNetworkFromTextFile(const char* graphFile,
      |                        ^~~~~~~~~~~~~~~~~~~~~~~~~
/home/arm-user/build/armnn/aarch64_build/include/armnnOnnxParser/IOnnxParser.hpp:51:24: note:   candidate expects 2 arguments, 3 provided

It seems that ../InferenceModel.hpp uses parser->CreateNetworkFromBinaryFile(modelPath.c_str(), inputShapes, requestedOutputs) to load the model, but the declarations of CreateNetworkFromBinaryFile only take 1 or 2 arguments:

    /// Create the network from a protobuf binary file on disk
    armnn::INetworkPtr CreateNetworkFromBinaryFile(const char* graphFile);

    /// Create the network from a protobuf text file on disk
    armnn::INetworkPtr CreateNetworkFromTextFile(const char* graphFile);

    /// Create the network directly from protobuf text in a string. Useful for debugging/testing
    armnn::INetworkPtr CreateNetworkFromString(const std::string& protoText);

    /// Create the network from a protobuf binary file on disk, with inputShapes specified
    armnn::INetworkPtr CreateNetworkFromBinaryFile(const char* graphFile,
                                                   const std::map<std::string, armnn::TensorShape>& inputShapes);

I am not familiar with Arm NN. If I remove requestedOutputs, will this program still run correctly? Or are there any tutorials on compiling the programs under the tests folder?
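
For reference, this is a minimal, untested sketch of what I think a standalone call to the two-argument overload would look like; the model path, input tensor name, and shape below are placeholders rather than values taken from the actual test:

    // Untested sketch: parse an ONNX model with the two-argument overload
    // declared in IOnnxParser.hpp (there is no requestedOutputs parameter).
    #include <armnn/ArmNN.hpp>
    #include <armnnOnnxParser/IOnnxParser.hpp>

    #include <map>
    #include <string>

    int main()
    {
        armnnOnnxParser::IOnnxParserPtr parser = armnnOnnxParser::IOnnxParser::Create();

        // Placeholder input name and NCHW shape for an MNIST-style model.
        const unsigned int dims[] = { 1, 1, 28, 28 };
        std::map<std::string, armnn::TensorShape> inputShapes =
            { { "Input3", armnn::TensorShape(4, dims) } };

        // Only input shapes can be overridden here; this overload provides
        // no way to request specific outputs.
        armnn::INetworkPtr network =
            parser->CreateNetworkFromBinaryFile("mnist.onnx", inputShapes);

        // ... optimize the network and load it into an armnn::IRuntime as usual.
        return 0;
    }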

Colm-in-Arm commented 3 months ago

Hi,

This OnnxMnist-Armnn.cpp test program is part of a larger set that is built with CMake via tests/CMakeLists.txt. I suggest you have a look at the build tool; the documentation is here: https://github.com/ARM-software/armnn/tree/branches/armnn_24_05/build-tool

Colm.

qianfei11 commented 3 months ago

Thanks for replying!

I tried using the build-tool to build the programs under the tests folder, and it works. Here is the command in case anyone else needs it:

$ ./build-armnn.sh --target-arch=aarch64 --all --cl-backend --armnn-cmake-args="-DBUILD_TESTS=1"