Closed · vvasily closed this 2 years ago
I got the following issue:
ERROR: /home/user/.cache/bazel/_bazel_root/3f4b1ab0378fd30fe519cc57222e9b37/external/org_tensorflow/tensorflow/core/BUILD:1025:11: C++ compilation of rule '@org_tensorflow//tensorflow/core:portable_tensorflow_lib_lite' failed (Exit 1): clang failed: error executing command external/androidndk/ndk/toolchains/llvm/prebuilt/linux-x86_64/bin/clang -gcc-toolchain external/androidndk/ndk/toolchains/aarch64-linux-android-4.9/prebuilt/linux-x86_64 -target ... (remaining 104 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox clang failed: error executing command external/androidndk/ndk/toolchains/llvm/prebuilt/linux-x86_64/bin/clang -gcc-toolchain external/androidndk/ndk/toolchains/aarch64-linux-android-4.9/prebuilt/linux-x86_64 -target ... (remaining 104 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox external/org_tensorflow/tensorflow/core/platform/tensor_coding.cc:42:10: error: call to member function 'append' is ambiguous out->append(strings[i]);
external/androidndk/ndk/sources/cxx-stl/llvm-libc++/include/string:938:19: note: candidate function
basic_string& append(const basic_string& __str);
^
external/androidndk/ndk/sources/cxx-stl/llvm-libc++/include/string:940:19: note: candidate function
basic_string& append(__self_view __sv) { return append(__sv.data(), __sv.size()); }
^
external/androidndk/ndk/sources/cxx-stl/llvm-libc++/include/string:951:19: note: candidate function not viable: no known conversion from 'const tensorflow::tstring' to 'const std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char> >::value_type *' (aka 'const char *') for 1st argument
basic_string& append(const value_type* __s);
^
external/androidndk/ndk/sources/cxx-stl/llvm-libc++/include/string:985:19: note: candidate function not viable: no known conversion from 'const tensorflow::tstring' to 'initializer_list<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char> >::value_type>' (aka 'initializer_list<char>') for 1st argument
basic_string& append(initializer_list<value_type> __il) {return append(__il.begin(), __il.size());}
^
external/androidndk/ndk/sources/cxx-stl/llvm-libc++/include/string:949:19: note: candidate function template not viable: requires at least 2 arguments, but 1 was provided
append(const _Tp& __t, size_type __pos, size_type __n=npos);
^
external/androidndk/ndk/sources/cxx-stl/llvm-libc++/include/string:965:5: note: candidate function template not viable: requires 2 arguments, but 1 was provided
append(_InputIterator __first, _InputIterator __last) {
^
external/androidndk/ndk/sources/cxx-stl/llvm-libc++/include/string:979:5: note: candidate function template not viable: requires 2 arguments, but 1 was provided
append(_ForwardIterator __first, _ForwardIterator __last) {
^
external/androidndk/ndk/sources/cxx-stl/llvm-libc++/include/string:950:19: note: candidate function not viable: requires 2 arguments, but 1 was provided
basic_string& append(const value_type* __s, size_type __n);
^
external/androidndk/ndk/sources/cxx-stl/llvm-libc++/include/string:952:19: note: candidate function not viable: requires 2 arguments, but 1 was provided
basic_string& append(size_type __n, value_type __c);
^
external/androidndk/ndk/sources/cxx-stl/llvm-libc++/include/string:941:19: note: candidate function not viable: requires at least 2 arguments, but 1 was provided
basic_string& append(const basic_string& __str, size_type __pos, size_type __n=npos);
^
1 error generated.
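The ambiguity comes from tensorflow::tstring being implicitly convertible to both std::string and the NDK libc++'s __self_view (a string_view), so the overloads at lines 938 and 940 each need one user-defined conversion and neither is preferred. A minimal, self-contained sketch of the same situation (Tstr is a hypothetical stand-in, not the real tstring):

```cpp
#include <iostream>
#include <string>
#include <string_view>

// Hypothetical stand-in for tensorflow::tstring: convertible to both
// std::string and std::string_view, just as tstring is in this libc++.
struct Tstr {
  std::string data_;
  operator const std::string&() const { return data_; }
  operator std::string_view() const { return data_; }
};

// Mimic the two non-template overloads from the NDK's libc++ <string>.
void append(const std::string&) { std::cout << "string overload\n"; }
void append(std::string_view) { std::cout << "string_view overload\n"; }

int main() {
  Tstr s{"hello"};
  // append(s);                         // error: call to 'append' is ambiguous
  append(static_cast<std::string>(s));  // the explicit cast picks one overload
}
```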
Hi @sgowroji,
Here is my pbtxt:
node { calculator: "OpenCvVideoDecoderCalculator" input_side_packet: "INPUT_FILE_PATH:input_video_path" output_stream: "VIDEO:input_video" output_stream: "VIDEO_PRESTREAM:input_video_header" }
node { calculator: "ColorConvertCalculator" input_stream: "RGB_IN:input_video" output_stream: "GRAY_OUT:output_video" }
node { calculator: "ImageTransformationCalculator" input_stream: "IMAGE:output_video" output_stream: "IMAGE:transformed_output_video" node_options: { [type.googleapis.com/mediapipe.ImageTransformationCalculatorOptions] { output_width: 224 output_height: 224 } } }
node { calculator: "ImageCroppingCalculator" input_stream: "IMAGE:transformed_output_video" output_stream: "IMAGE:cropped_output_video" node_options: { [type.googleapis.com/mediapipe.ImageCroppingCalculatorOptions] { width: 112 height: 112 } } }
node { calculator: "TfLiteConverterCalculator" input_stream: "IMAGE:cropped_output_video" output_stream: "TENSORS:image_tensor" node_options: { [type.googleapis.com/mediapipe.TfLiteConverterCalculatorOptions] { use_custom_normalization: true custom_div: 43.044 custom_sub: 2.465 } } }
node { calculator: "TfLiteInferenceCalculator" input_stream: "TENSORS:image_tensor" output_stream: "TENSORS:decoded_tensors" node_options: { [type.googleapis.com/mediapipe.TfLiteInferenceCalculatorOptions] { model_path: "mediapipe/modules/face_detection/videonet_full.tflite" use_gpu: false } } }
node { calculator: "TfLiteTensorsToFloatsCalculator" input_stream: "TENSORS:decoded_tensors" output_stream: "FLOATS:inference_floats" }
There are no changes in the code.
@sgowroji I have found a workaround for loading a tflite model with Flex:
1) Add "@org_tensorflow//tensorflow/lite/delegates/flex:delegate" to the deps of tflite_inference_calculator:

}) + select({
    "//conditions:default": [],
    "//mediapipe:android": [
        "//mediapipe/util/android/file/base",
        "@org_tensorflow//tensorflow/lite/delegates/nnapi:nnapi_delegate",
        "@org_tensorflow//tensorflow/lite/delegates/flex:delegate",
    ],
})
2) Patch two lines to cast the tstring explicitly:

In ~/.cache/bazel/_bazel_root/3f4b1ab0378fd30fe519cc57222e9b37/external/org_tensorflow/tensorflow/core/platform/tensor_coding.cc, line 42:
out->append(static_cast<std::string>(strings[i]));

In ~/.cache/bazel/_bazel_root/3f4b1ab0378fd30fe519cc57222e9b37/external/org_tensorflow/tensorflow/core/lib/io/buffered_inputstream.cc, line 194:
result->append(static_cast<std::string>(buf));
But I'm not sure that this is the proper way to do it.
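An alternative patch that sidesteps the ambiguous single-argument overload set without copying through a temporary std::string would be the two-argument append; this is a sketch, not from the thread, and assumes tensorflow::tstring exposes data() and size() like std::string:

```cpp
// append(const char*, size_t) is never ambiguous and avoids the temporary
// std::string that static_cast<std::string>(...) creates.
// Assumes tensorflow::tstring provides data()/size().
out->append(strings[i].data(), strings[i].size());  // tensor_coding.cc:42
result->append(buf.data(), buf.size());             // buffered_inputstream.cc:194
```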
Hi @vvasily, there's a chance that your model is not available at the path you provided. Please check what you need to do to make sure your iOS app has the custom.tflite model.
For example, for the iris demo the models are made available by including them in "data": https://github.com/google/mediapipe/blob/master/mediapipe/examples/ios/iristrackinggpu/BUILD#L66-L67
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you.
Closing as stale. Please reopen if you'd like to work on this further.
I'm trying to build an Android app that invokes a tflite model in TfLiteInferenceCalculator, but when LoadModel is called in tflite_inference_calculator.cc I get an error that the tflite model cannot be loaded because of the FlexPad op.
It seems that I need to add the tensorflow/lite/delegates/flex:delegate build dependency, but with this dependency the app does not build. I tried adding flex:delegate to tflite_inference_calculator in the BUILD file but got the error above.
How can I fix this so that Flex ops are supported in tflite?
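For reference, linking @org_tensorflow//tensorflow/lite/delegates/flex:delegate into the binary is the documented way to enable select TF ops: the Flex delegate is applied automatically when the interpreter is created, with no extra API call. A minimal sketch of the load path that fails without it (LoadFlexModel is a hypothetical helper):

```cpp
#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

// Loads a .tflite model that contains select TF ops such as FlexPad.
// With flex:delegate linked into the binary, the delegate is picked up
// automatically while the interpreter is built.
bool LoadFlexModel(const char* path) {
  auto model = tflite::FlatBufferModel::BuildFromFile(path);
  if (!model) return false;
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  if (tflite::InterpreterBuilder(*model, resolver)(&interpreter) != kTfLiteOk ||
      !interpreter) {
    return false;  // without flex:delegate, Flex ops fail to resolve here
  }
  return interpreter->AllocateTensors() == kTfLiteOk;
}
```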