tiensu closed this issue 2 years ago
Unfortunately, I provide the pre-built tflite + GPU library only for Android, because I can't prepare a build environment for the other platforms.
If you build libtensorflowlite_gpu_delegate.so in your own environment and place it in play_with_tflite\InferenceHelper\third_party\tflite_prebuilt\aarch64, it may work.
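For reference, here is a rough sketch of how that shared library could be cross-compiled with Bazel from a TensorFlow source checkout. The `elinux_aarch64` config name, the exact Bazel target path, and the destination path are assumptions that may vary with your TensorFlow version and checkout location:

```bash
# Rough sketch: cross-compile the TFLite GPU delegate for aarch64 Linux.
# Config name and target path are assumptions; check the BUILD files of your TensorFlow version.
git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
bazel build -c opt --config=elinux_aarch64 \
    //tensorflow/lite/delegates/gpu:libtensorflowlite_gpu_delegate.so

# Copy the result to where InferenceHelper looks for the prebuilt library
# (adjust the path to wherever play_with_tflite is checked out).
cp bazel-bin/tensorflow/lite/delegates/gpu/libtensorflowlite_gpu_delegate.so \
    /path/to/play_with_tflite/InferenceHelper/third_party/tflite_prebuilt/aarch64/
```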
Thank you, iwatake2222!
Environment (Hardware)
Project Name
pj_tflite_det_yolov5
Issue Details
Using the information in the README.md file, I can successfully build and run the YOLOv5 model on a Qualcomm RB5 device. I understand that, by default, the model runs on the CPU. Now I want to run it on the GPU, so I modified `inference_helper_` as follows: `inference_helper_.reset(InferenceHelper::Create(InferenceHelper::kTensorflowLiteGpu));`. Then I ran the CMake command, but there is an error.
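For context, a minimal sketch of the change being described (the default CPU backend enum value `kTensorflowLite`, the header name, and the surrounding function are assumptions for illustration, not confirmed against the project source):

```cpp
// Minimal sketch of the backend switch described above (not the full project file).
#include <memory>
#include "inference_helper.h"   // header name is an assumption

std::unique_ptr<InferenceHelper> inference_helper_;

void CreateInferenceHelper()
{
    // Before (assumed default, runs on CPU):
    // inference_helper_.reset(InferenceHelper::Create(InferenceHelper::kTensorflowLite));

    // After (attempt to use the TFLite GPU delegate):
    inference_helper_.reset(InferenceHelper::Create(InferenceHelper::kTensorflowLiteGpu));
}
```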
How to Reproduce
Error Log
Additional Information
Reference link to RB5 device: https://developer.qualcomm.com/qualcomm-robotics-rb5-kit/hardware-reference-guide
Please help me fix this problem! Thank you very much!