gmlwns2000 / TensorFlowSharp

TensorFlow API for Xamarin Families (Windows, Android, UWP, and more)
MIT License

Build an Android TensorFlow binary with DT_BOOL support #3

Open DumDumin opened 7 years ago

DumDumin commented 7 years ago

Hey guys,

I get the following error when running a network on Android:

    Unhandled Exception: TensorFlow.TFException: No OpKernel was registered to support Op 'Switch' with these attrs. Registered devices: [CPU], Registered kernels:
      device='GPU'; T in [DT_STRING]
      device='GPU'; T in [DT_BOOL]
      device='GPU'; T in [DT_INT32]
      device='GPU'; T in [DT_FLOAT]
      device='CPU'; T in [DT_FLOAT]
      device='CPU'; T in [DT_INT32]
      [[Node: model/comb1/comb1/cond/Switch = Switch[T=DT_BOOL](ph/training_ph, ph/training_ph)]] occurred

It looks like the bool type is not supported in the CPU version of TensorFlow. I found this fix: https://github.com/tensorflow/models/issues/1740 but I wasn't able to make it work...

gmlwns2000 commented 7 years ago

Did you build your own native library?

DumDumin commented 7 years ago

No, not yet... I tried to use the GPU version provided by miguel, but something else is going wrong with that..

Can anyone provide a prebuilt native library with bool support? I suppose I'm not the only one having trouble with this, and it would spare me a lot of time compared to building the C library myself before I can even get started.

Thanks for any help :)

gmlwns2000 commented 7 years ago

I built a native library with bool support: Google Drive. It worked fine with my own model (which includes BatchNorm layers containing Switch ops).

You can also find GPU-enabled binaries in GitIgnoredDatas.zip, the zip file that I uploaded. Just include them in TensorFlowSharp.Windows instead of the CPU version of libtensorflow.dll. The following files need to be included to use the GPU on Windows.

If you want to produce the DLL yourself, copy _pywrap_tensorflow_internal.pyd from the Python package and rename the .pyd to libtensorflow.dll. You will need its dependency DLLs as well.
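The rename step can be sketched as a small script. This is only an illustration: `SITE_PACKAGES` and `DEST_DIR` are hypothetical placeholder paths, and the script drops a placeholder .pyd so it can be run end to end without TensorFlow installed.

```shell
#!/bin/sh
# Sketch of turning the Python package's _pywrap_tensorflow_internal.pyd
# into libtensorflow.dll. Point SITE_PACKAGES at your real Python install
# and DEST_DIR at your TensorFlowSharp.Windows native directory.
set -e

SITE_PACKAGES="${SITE_PACKAGES:-demo-site-packages}"
DEST_DIR="${DEST_DIR:-native}"
SRC="$SITE_PACKAGES/tensorflow/python/_pywrap_tensorflow_internal.pyd"

# Demo scaffolding only: create a placeholder .pyd if none exists,
# so the sketch runs even on a machine without TensorFlow.
if [ ! -f "$SRC" ]; then
    mkdir -p "$(dirname "$SRC")"
    printf 'placeholder' > "$SRC"
fi

mkdir -p "$DEST_DIR"
# A .pyd is an ordinary Windows DLL, so copying it under the new name
# is enough; remember to ship its dependency DLLs alongside it too.
cp "$SRC" "$DEST_DIR/libtensorflow.dll"
echo "wrote $DEST_DIR/libtensorflow.dll"
```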

DumDumin commented 7 years ago

Thanks, man! Your newly built native library is working for me :)

gmlwns2000 commented 7 years ago

Great!

hgffly commented 7 years ago

Hi @gmlwns2000, I tried your prebuilt libraries and they work for me. I'm also pretty interested in how to build libraries like that; could you share how you made them? Thanks

gmlwns2000 commented 7 years ago

Hi @hgffly, this is how to build the TensorFlow C library for Android. I hope this helps you.

  1. Make TensorFlow for Android support DT_BOOL. Edit the following line in register_types.h (https://github.com/tensorflow/tensorflow/blob/v1.1.0-rc2/tensorflow/core/framework/register_types.h#L125): change `#define TF_CALL_bool(m)` to `#define TF_CALL_bool(m) m(bool)`. Source: stackoverflow (https://stackoverflow.com/questions/40855271/no-opkernel-was-registered-to-support-op-switch-with-these-attrs-on-ios/43627334#43627334)
  2. Change the build options so that the `TF_` functions appear in libtensorflow_inference.so. The version script (https://github.com/tensorflow/tensorflow/blob/v1.1.0-rc2/tensorflow/contrib/android/jni/version_script.lds) defines which functions are exported. Add a `TF_*;` line to the `global` section:
    VERS_1.0 {
      global:
        Java_*;
        JNI_OnLoad;
        JNI_OnUnload;
        TF_*;
      local:
        *;
    };
  3. Build libtensorflow_inference.so. You have to build several times to support all platforms (arm64-v8a, armeabi-v7a, x86, x86_64). Enter this command at the root of the tensorflow directory:
    bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so \
    --crosstool_top=//external:android/crosstool \
    --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
    --cpu=armeabi-v7a
    • To change the target platform, just change the `--cpu=armeabi-v7a` value.
    • If you run out of memory while building, add `--verbose_failures --local_resources 4096,4.0,1.0 -j 1`
  4. Now you can find libtensorflow_inference.so in bazel-bin/tensorflow/contrib/android. Copy it somewhere else, go back to step 3, change the CPU target, and build again until every platform is covered.
  5. After building the binary for each platform, rename libtensorflow_inference.so to libtensorflow.so and copy the files into the correct place (e.g. TensorflowSharp/GitIgnoredData/android/armeabi-v7a/).
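Steps 3-5 can be sketched as one loop. This sketch only generates a `build-all.sh` for review; the script name and the `OUT_ROOT` destination are assumptions modeled on the layout mentioned in step 5.

```shell
#!/bin/sh
# Generate a build-all.sh that builds libtensorflow_inference.so for
# each Android ABI and collects the results as libtensorflow.so.
# Review the generated script, then run it from the TensorFlow root.
set -e

OUT_ROOT="TensorflowSharp/GitIgnoredData/android"

{
    echo '#!/bin/sh'
    echo 'set -e'
    for CPU in arm64-v8a armeabi-v7a x86 x86_64; do
        echo "bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so \\"
        echo "    --crosstool_top=//external:android/crosstool \\"
        echo "    --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \\"
        echo "    --cpu=$CPU"
        echo "mkdir -p $OUT_ROOT/$CPU"
        echo "cp bazel-bin/tensorflow/contrib/android/libtensorflow_inference.so $OUT_ROOT/$CPU/libtensorflow.so"
    done
} > build-all.sh
echo "wrote build-all.sh"
```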

My solution is neither official nor clean; if you have a better way to build the native library for Android, please share it :) Thanks

hgffly commented 7 years ago

Hi gmlwns2000,

Thanks for sharing how to build the library for Android, but I encountered an error in step 3:

    ERROR: /home/wilson/.cache/bazel/_bazel_wilson/416ee1e5c1dd220e496a2567f04ee5b5/external/protobuf/BUILD:113:1: C++ compilation of rule '@protobuf//:protobuf' failed: false failed: error executing command /bin/false -MD -MF bazel-out/stub_armeabi-v7a-py3-opt/bin/external/protobuf/_objs/protobuf/external/protobuf/src/google/protobuf/wrappers.pb.pic.d ... (remaining 26 argument(s) skipped): com.google.devtools.build.lib.shell.BadExitStatusException: Process exited with status 1.
    Target //tensorflow/contrib/android:libtensorflow_inference.so failed to build

I may need to investigate how to resolve it; once I do, I will try again soon. Which version of TensorFlow do you use? Mine is r1.2.

Thanks for help!


gmlwns2000 commented 7 years ago

Hi @hgffly,

I used TensorFlow 1.2.0-r0 to build the library. I'm sorry, but I'm not good at Bazel builds, so I can't help you there :(

hgffly commented 7 years ago

Hi gmlwns2000,

It's fine, you have already helped a lot, thanks so much!
