iwatake2222 / play_with_tflite

Sample projects for TensorFlow Lite in C++ with delegates such as GPU, EdgeTPU, XNNPACK, NNAPI
Apache License 2.0

How to run Yolov5 on Qualcomm RB5 device with GPU? #56

Closed: tiensu closed this issue 2 years ago

tiensu commented 2 years ago

Environment (Hardware)

Project Name

pj_tflite_det_yolov5

Issue Details

Using the information in the README.md file, I can successfully build and run the YOLOv5 model on a Qualcomm RB5 device. I understand that, by default, the model runs on the CPU. Now I want to run it on the GPU, so I modified the inference helper creation as follows: inference_helper_.reset(InferenceHelper::Create(InferenceHelper::kTensorflowLiteGpu)); Then I ran the CMake command, but an error occurred.
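For reference, the whole change is just swapping the type passed to InferenceHelper::Create(). A minimal sketch of the relevant part of image_processor/detection_engine.cpp is shown below; the header name, the member declaration, and the wrapper function are assumptions made for illustration, so details may differ from the actual file.

    #include <memory>
    #include "inference_helper.h"  // assumed header name from the InferenceHelper submodule

    // In the real code this is a member of the engine class; shown standalone here.
    std::unique_ptr<InferenceHelper> inference_helper_;

    void CreateInferenceHelper()  // hypothetical wrapper, for illustration only
    {
        // Default (CPU via the XNNPACK delegate), now commented out:
        // inference_helper_.reset(InferenceHelper::Create(InferenceHelper::kTensorflowLiteXnnpack));

        // GPU delegate: needs libtensorflowlite_gpu_delegate.so built for the target platform
        inference_helper_.reset(InferenceHelper::Create(InferenceHelper::kTensorflowLiteGpu));
    }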

How to Reproduce

  1. Clone project into RB5 device: $ git clone https://github.com/iwatake2222/play_with_tflite.git
  2. $ cd play_with_tflite/pj_tflite_det_yolov5
  3. Open file: vi image_processor/detection_engine.cpp
  4. Comment out line 84: inference_helper_.reset(InferenceHelper::Create(InferenceHelper::kTensorflowLiteXnnpack));
  5. Uncomment line 85: inference_helper_.reset(InferenceHelper::Create(InferenceHelper::kTensorflowLiteGpu));
  6. $ mkdir build && cd build
  7. $ cmake .. -DINFERENCE_HELPER_ENABLE_TFLITE_DELEGATE_EDGETPU=off -DINFERENCE_HELPER_ENABLE_TFLITE_DELEGATE_GPU=on -DINFERENCE_HELPER_ENABLE_TFLITE_DELEGATE_XNNPACK=off
     CMake reports: [main] CMAKE_SYSTEM_PROCESSOR = aarch64, BUILD_SYSTEM = aarch64
  8. The error shown below is displayed.

Error Log

(The error log was attached as a screenshot; its text is not available here.)

Additional Information

Reference link to RB5 device: https://developer.qualcomm.com/qualcomm-robotics-rb5-kit/hardware-reference-guide

Please help me fix this problem! Thank you very much!

iwatake2222 commented 2 years ago

Unfortunately, I provide the pre-built tflite + GPU delegate library only for Android, because I can't prepare build environments for other platforms. If you build libtensorflowlite_gpu_delegate.so in your environment and place it in play_with_tflite\InferenceHelper\third_party\tflite_prebuilt\aarch64, it may work.
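For reference, a rough sketch of building the GPU delegate from the TensorFlow sources with Bazel is below. The bazel target comes from the TensorFlow Lite GPU delegate documentation; the --config=elinux_aarch64 cross-compile config and the copy destination are assumptions, and the RB5 will also need the OpenCL/EGL libraries the delegate expects at runtime.

    # Rough sketch (not verified on the RB5): build the TFLite GPU delegate for aarch64 Linux
    git clone https://github.com/tensorflow/tensorflow.git
    cd tensorflow
    ./configure
    bazel build -c opt --config=elinux_aarch64 \
        //tensorflow/lite/delegates/gpu:libtensorflowlite_gpu_delegate.so
    # Place the library where play_with_tflite looks for prebuilt tflite binaries
    cp bazel-bin/tensorflow/lite/delegates/gpu/libtensorflowlite_gpu_delegate.so \
        <path-to>/play_with_tflite/InferenceHelper/third_party/tflite_prebuilt/aarch64/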

tiensu commented 2 years ago

Thank you, iwatake2222!