This means an error occurred somewhere... The correct output should look like this:
odin:/data/local/tmp $ ./inference --merged_model ./mobilenet_flowers102.paddle --input_size 150528
I1211 17:12:53.334666 4858 Util.cpp:166] commandline:
Time of init paddle 3.4388 ms.
Time of create from merged model file 141.045 ms.
Time of forward time 398.818 ms.
I wrote Chinese documentation for this example in PR #49; it explains each step in detail and provides a reference model for download, so you could try that first.
@Xreki I see that your output does not contain the warning WARNING: linker: /data/local/tmp/inference: unused DT entry: type 0xf arg 0x826, so I suspect there is something wrong with my inference binary.
I rebuilt it:
cmake .. \
-DANDROID_ABI=armeabi-v7a \
-DANDROID_STANDALONE_TOOLCHAIN=/home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain \
-DPADDLE_ROOT=/home/wang/ubuntu/paddlepaddle/android/install \
-DCMAKE_BUILD_TYPE=MinSizeRel
The output log was:
-- The CXX compiler identification is GNU 4.9.0
-- The C compiler identification is GNU 4.9.0
-- Check for working CXX compiler: /home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain/bin/arm-linux-androideabi-g++
-- Check for working CXX compiler: /home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain/bin/arm-linux-androideabi-g++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working C compiler: /home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain/bin/arm-linux-androideabi-gcc
-- Check for working C compiler: /home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain/bin/arm-linux-androideabi-gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Found PaddlePaddle (include: /home/wang/ubuntu/paddlepaddle/android/install/include/paddle; library: /home/wang/ubuntu/paddlepaddle/android/install/lib/armeabi-v7a/libpaddle_capi_shared.so)
-- Configuring done
-- Generating done
-- Build files have been written to: /media/wang/软件/ubuntu/Mobile-test/Mobile2/benchmark/tool/C/build
Then I ran make, which output:
Scanning dependencies of target inference
[ 50%] Building CXX object CMakeFiles/inference.dir/inference.cc.o
[100%] Linking CXX executable inference
[100%] Built target inference
Do you see any problem in my output log?
I also used the mobilenet_flowers102.paddle provided in the documentation, and it reported the same error.
Judging from the cmake and make output, there is no problem there. My test phone is arm64-v8a; I have also just built an armeabi-v7a version, and running it on the test phone gives the following output:
255|odin:/data/local/tmp $ ./inference --merged_model ./mobilenet_flowers102.paddle --input_size 150528
WARNING: linker: /data/local/tmp/inference: unused DT entry: type 0xf arg 0x720
I1212 16:11:39.299767 5910 Util.cpp:166] commandline:
Time of init paddle 4.17109 ms.
Time of create from merged model file 83.2597 ms.
Time of forward time 654.964 ms.
The warning WARNING: linker: /data/local/tmp/inference: unused DT entry: type 0xf arg 0x720 appears here as well, but the program still runs normally and does not hit paddle forward error!. So could you share your test phone's architecture and the Android API level of the standalone toolchain you are using?
Also, our inference.cc currently cannot print the error category, which makes failures like this harder to pin down; I will fix that as soon as possible.
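For reference, here is a minimal sketch of what such error reporting might look like, assuming the C API's paddle_gradient_machine_forward call and the paddle_error / kPD_NO_ERROR codes; it is an illustration only, not the actual inference.cc:

```cpp
// Minimal sketch (not the actual inference.cc): print the numeric error
// category returned by the C-API forward call instead of a generic message.
// Assumes the PaddlePaddle C API header installed under include/paddle.
#include <cstdio>
#include <paddle/capi.h>

// Hypothetical helper: run one forward pass and report the error category.
bool forward_and_report(paddle_gradient_machine machine,
                        paddle_arguments in_args,
                        paddle_arguments out_args) {
  paddle_error err = paddle_gradient_machine_forward(
      machine, in_args, out_args, /* isTrain = */ false);
  if (err != kPD_NO_ERROR) {
    // Printing the enum value narrows down where the failure comes from.
    std::fprintf(stderr, "paddle forward error! (paddle_error = %d)\n",
                 static_cast<int>(err));
    return false;
  }
  return true;
}
```

Even just the numeric error value would make it easier to tell different failure modes apart.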
Here is my inference.zip; please give it a try when you have time. I am not sure whether the problem is on my side.
My standalone toolchain is armeabi-v7a, Android API 21:
/media/wang/软件/Android/ubuntu/android-ndk-r14b-linux-x86_64/build/tools/make-standalone-toolchain.sh \
--arch=arm --platform=android-21 --install-dir=/home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain
Also, my physical phone is only API 19 (a Coolpad Dazen F2), so I used an emulator; its configuration is as follows:
Name: ARM_android_7.1.1
CPU/ABI: Google APIs ARM (armeabi-v7a)
Path: /home/wang/.android/avd/ARM_android_7.1.1.avd
Target: google_apis [Google APIs] (API level 25)
Skin: nexus_4
SD Card: 100M
hw.dPad: no
hw.lcd.height: 1280
runtime.network.speed: full
hw.accelerometer: yes
hw.device.name: Nexus 4
vm.heapSize: 80
skin.dynamic: yes
hw.device.manufacturer: Google
hw.lcd.width: 768
hw.gps: yes
hw.initialOrientation: Portrait
image.androidVersion.api: 25
hw.audioInput: yes
image.sysdir.1: system-images/android-25/google_apis/armeabi-v7a/
hw.cpu.model: cortex-a8
tag.id: google_apis
showDeviceFrame: yes
hw.camera.back: emulated
hw.mainKeys: no
AvdId: ARM_android_7.1.1
hw.camera.front: emulated
hw.lcd.density: 320
avd.ini.displayname: ARM android 7.1.1
hw.gpu.mode: auto
hw.device.hash2: MD5:17b0085166068187e5b5660b49fe20b4
hw.ramSize: 1536
hw.trackBall: no
PlayStore.enabled: false
fastboot.forceColdBoot: no
hw.battery: yes
hw.cpu.ncore: 1
hw.sdCard: yes
tag.display: Google APIs
runtime.network.latency: none
hw.keyboard: yes
hw.sensors.proximity: yes
disk.dataPartition.size: 800M
hw.sensors.orientation: yes
avd.ini.encoding: UTF-8
hw.gpu.enabled: yes
I have tried your executable, and the same error also occurred on my test machine...
@yeyupiaoling For the inference program you built, did you modify the code in inference.cc? On my side it runs correctly on a Xiaomi 5, although the forward time is wrong (an order of magnitude too large).
WARNING: linker: /data/data/com.example.test/paddle/32_static/inference: unused DT entry: type 0xf arg 0x826
I1212 17:41:35.916736 7600 Util.cpp:166] commandline:
Time of init paddle 5.60583 ms.
Time of create from merged model file 74.652 ms.
Time of forward time 8594.93 ms.
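As a side note on the timing figures, here is a minimal, hypothetical sketch of how a wall-clock number like "Time of forward time" could be measured; the actual measurement in inference.cc may differ, and run_forward_once here is only a placeholder:

```cpp
// Hypothetical timing sketch (not the actual inference.cc): wall-clock
// measurement around a single forward pass, in the spirit of the
// "Time of forward time" line in the logs above.
#include <chrono>
#include <cstdio>

// Placeholder for one forward pass; the real benchmark would call into the
// Paddle C API here.
void run_forward_once() { /* model forward would go here */ }

int main() {
  auto start = std::chrono::steady_clock::now();
  run_forward_once();
  auto end = std::chrono::steady_clock::now();

  double ms = std::chrono::duration<double, std::milli>(end - start).count();
  std::printf("Time of forward time %g ms.\n", ms);
  return 0;
}
```

Numbers obtained this way are only comparable when they come from the same device with the same build settings.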
@Xreki I ran the build again:
cmake .. \
-DANDROID_ABI=armeabi-v7a \
-DANDROID_STANDALONE_TOOLCHAIN=/home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain_2 \
-DPADDLE_ROOT=/home/wang/ubuntu/paddlepaddle/android/install \
-DCMAKE_BUILD_TYPE=MinSizeRel
Here is the output log; note the warnings displayed at the end:
-- Found Paddle host system: ubuntu, version: 16.04.3
-- Found Paddle host system's CPU: 2 cores
-- The CXX compiler identification is GNU 5.4.0
-- The C compiler identification is GNU 5.4.0
-- The Golang compiler identification is go1.8.1 linux/amd64
-- Check for working Golang compiler: /usr/local/go/bin/go
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Could NOT find Sphinx (missing: SPHINX_EXECUTABLE)
-- Found Git: /usr/bin/git (found version "2.7.4")
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Looking for pthread_create
-- Looking for pthread_create - not found
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE
-- Performing Test MMX_FOUND
-- Performing Test MMX_FOUND - Success
-- Performing Test SSE2_FOUND
-- Performing Test SSE2_FOUND - Success
-- Performing Test SSE3_FOUND
-- Performing Test SSE3_FOUND - Success
-- Performing Test AVX_FOUND
-- Performing Test AVX_FOUND - Failed
-- Performing Test AVX2_FOUND
-- Performing Test AVX2_FOUND - Failed
-- Protobuf protoc executable: /home/wang/ubuntu/paddlepaddle/paddle-src/paddle2/build/third_party/install/protobuf/bin/protoc
-- Protobuf library: /home/wang/ubuntu/paddlepaddle/paddle-src/paddle2/build/third_party/install/protobuf/lib/libprotobuf.a
-- Protobuf version: 3.1
-- Found PythonInterp: /usr/bin/python2.7 (found suitable version "2.7.12", minimum required is "2.7")
-- Found PythonLibs: /usr/lib/x86_64-linux-gnu/libpython2.7.so (found suitable version "2.7.12", minimum required is "2.7")
-- Found PY_pip: /usr/local/lib/python2.7/dist-packages/pip
-- Found PY_numpy: /usr/local/lib/python2.7/dist-packages/numpy
-- Found PY_wheel: /usr/lib/python2.7/dist-packages/wheel
-- Found PY_google.protobuf: /usr/local/lib/python2.7/dist-packages/google/protobuf
-- Found NumPy: /usr/local/lib/python2.7/dist-packages/numpy/core/include
-- Found ATLAS (include: /usr/include, library: /usr/lib/liblapack_atlas.so;/usr/lib/libcblas.so)
-- Found lapack in ATLAS (include: /usr/include/atlas)
-- BLAS library: /usr/lib/liblapack_atlas.so;/usr/lib/libcblas.so
-- Found SWIG: /usr/bin/swig3.0 (found version "3.0.8")
-- warp-ctc library: /home/wang/ubuntu/paddlepaddle/paddle-src/paddle2/build/third_party/install/warpctc/lib/libwarpctc.so
-- Looking for UINT64_MAX
-- Looking for UINT64_MAX - found
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for stdint.h
-- Looking for stdint.h - found
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Check size of pthread_spinlock_t
-- Check size of pthread_spinlock_t - done
-- Check size of pthread_barrier_t
-- Check size of pthread_barrier_t - done
-- Performing Test C_COMPILER_SUPPORT_FLAG__fPIC
-- Performing Test C_COMPILER_SUPPORT_FLAG__fPIC - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__fPIC
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__fPIC - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__fno_omit_frame_pointer
-- Performing Test C_COMPILER_SUPPORT_FLAG__fno_omit_frame_pointer - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__fno_omit_frame_pointer
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__fno_omit_frame_pointer - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wall
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wall - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wall
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wall - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wextra
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wextra - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wextra
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wextra - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Werror
-- Performing Test C_COMPILER_SUPPORT_FLAG__Werror - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Werror
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Werror - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wnon_virtual_dtor
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wnon_virtual_dtor - Failed
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wnon_virtual_dtor
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wnon_virtual_dtor - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wdelete_non_virtual_dtor
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wdelete_non_virtual_dtor - Failed
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wdelete_non_virtual_dtor
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wdelete_non_virtual_dtor - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_unused_parameter
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_unused_parameter - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_unused_parameter
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_unused_parameter - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_unused_function
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_unused_function - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_unused_function
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_unused_function - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_literal_suffix
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_literal_suffix - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_literal_suffix
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_literal_suffix - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_sign_compare
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_sign_compare - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_sign_compare
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_sign_compare - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_unused_local_typedefs
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_unused_local_typedefs - Success
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_unused_local_typedefs
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_unused_local_typedefs - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_parentheses_equality
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_parentheses_equality - Failed
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_parentheses_equality
-- Performing Test CXX_COMPILER_SUPPORT_FLAG__Wno_error_parentheses_equality - Failed
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_unused_function
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_unused_function - Success
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_array_bounds
-- Performing Test C_COMPILER_SUPPORT_FLAG__Wno_error_array_bounds - Success
CMake Warning at cmake/version.cmake:20 (message):
Cannot add paddle version from git tag
Call Stack (most recent call first):
CMakeLists.txt:141 (include)
-- Paddle version is 0.0.0
-- Skip compiling with MKLDNNMatrix
-- Skip compiling with MKLDNNLayers and MKLDNNActivations
CMake Deprecation Warning at /usr/local/share/cmake-3.8/Modules/UseSWIG.cmake:226 (message):
SWIG_ADD_MODULE is deprecated. Use SWIG_ADD_LIBRARY instead.
Call Stack (most recent call first):
paddle/api/CMakeLists.txt:56 (SWIG_ADD_MODULE)
-- Configuring done
-- Generating done
CMake Warning:
Manually-specified variables were not used by the project:
ANDROID_ABI
ANDROID_STANDALONE_TOOLCHAIN
PADDLE_ROOT
-- Build files have been written to: /home/wang/ubuntu/paddlepaddle/paddle-src/paddle2/build
@hedaoyuan At least it does not error out and can run. Do you have a ready-built inference binary I could try?
@yeyupiaoling You ran the demo's cmake in the Paddle directory again...
Here is the inference executable I built. By the way, the executable you built is much larger than mine: yours is 58157 KB, while mine is only 7066 KB.
PS: I tried your inference on three test phones, and the same error occurred on all of them...
@Xreki Well... this is awkward.
I rebuilt the standalone toolchain and reconfigured the cross-compilation settings, then ran make again. The resulting inference is much smaller now, only 7.9 MB (yours is 7.2 MB). It still fails, but with a different error code; here is my inference.zip.
Your inference did not produce paddle forward error! on my side; the output log is as follows:
WARNING: linker: /data/local/tmp/inference: unused DT entry: type 0xf arg 0x6fa
I1212 19:53:43.475316 2721 Util.cpp:166] commandline:
Time of init paddle 331.797 ms.
Time of create from merged model file 3728.46 ms.
Time of forward time 83965.5 ms.
I ran the build in Mobile2/benchmark/tool/C/build under Mobile; the log is as follows:
-- The CXX compiler identification is GNU 4.9.0
-- The C compiler identification is GNU 4.9.0
-- Check for working CXX compiler: /home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain_2/bin/arm-linux-androideabi-g++
-- Check for working CXX compiler: /home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain_2/bin/arm-linux-androideabi-g++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working C compiler: /home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain_2/bin/arm-linux-androideabi-gcc
-- Check for working C compiler: /home/wang/ubuntu/paddlepaddle/android/arm_standalone_toolchain_2/bin/arm-linux-androideabi-gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Found PaddlePaddle (include: /home/wang/ubuntu/paddlepaddle/android/install/include/paddle; library: /home/wang/ubuntu/paddlepaddle/android/install/lib/armeabi-v7a/libpaddle_capi_shared.so)
-- Configuring done
-- Generating done
-- Build files have been written to: /media/wang/软件/ubuntu/Mobile-test/Mobile2/benchmark/tool/C/build
root@yeyupiaoling:/media/wang/软件/ubuntu/Mobile-test/Mobile2/benchmark/tool/C/build# make
Scanning dependencies of target inference
[ 50%] Building CXX object CMakeFiles/inference.dir/inference.cc.o
[100%] Linking CXX executable inference
[100%] Built target inference
This is really strange; I have tried every build combination I can think of and cannot reproduce your problem. Please try the Paddle libraries from https://github.com/PaddlePaddle/Mobile/wiki; we have verified those libraries in our production workloads.
@Xreki It must be a problem on my side. I will redo everything thoroughly from scratch over the weekend.
Thank you for your support. There must also be places where our support is not yet complete, which is why a problem this hard to pin down appeared in the first place. Please keep the feedback coming :smile:
By the way, PaddlePaddle is giving out anniversary gifts, don't miss it: https://github.com/PaddlePaddle/Paddle/issues/6542
For the command in the last step of this tutorial, I ran ./inference --merged_model ./mobilenet.paddle --input_size 784, since I am using PaddlePaddle's handwritten digit recognition model, and it reports the error below. What does the first argument of that command mean?
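As a closing note on --input_size: it presumably has to be the flattened length of one input sample (channels × height × width). That interpretation is an assumption here, but it is consistent with the 150528 = 3 × 224 × 224 used for the flowers model earlier in this thread and the 784 = 1 × 28 × 28 of MNIST-style input. A minimal sketch of the arithmetic:

```cpp
// Sketch of how --input_size may relate to the input image shape
// (assumption: the flag is the flattened length of one input sample).
#include <cstdio>

int flattened_size(int channels, int height, int width) {
  return channels * height * width;
}

int main() {
  // MobileNet flowers102 demo input: 3 x 224 x 224 = 150528.
  std::printf("mobilenet_flowers102: %d\n", flattened_size(3, 224, 224));
  // MNIST-style handwritten digits: 1 x 28 x 28 = 784.
  std::printf("mnist: %d\n", flattened_size(1, 28, 28));
  return 0;
}
```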