Closed RichardLiee closed 4 years ago
Could you try removing the '.index' in CHECKPOINT_PATH?
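For context: a TF checkpoint is stored as several files (`.index`, `.meta`, `.data-…`), and `--trained_checkpoint_prefix` expects their common prefix, not one of the files. A minimal sketch of stripping such a suffix (the path below is made up for illustration):

```python
def checkpoint_prefix(path):
    """Strip a per-file checkpoint suffix so only the common prefix remains."""
    for suffix in (".index", ".meta"):
        if path.endswith(suffix):
            return path[: -len(suffix)]
    return path

# Hypothetical path for illustration:
print(checkpoint_prefix("/tmp/train/model.ckpt-50000.index"))
```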
It works, thank you very much!
@pkulzc Hi, I can export a TFLite model only with inference type FLOAT. My pretrained model was downloaded from http://download.tensorflow.org/models/object_detection/ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz
What's the difference between QUANTIZED_UINT8 and FLOAT, e.g. in efficiency on a phone CPU and in model size?
How can I export an ssdlite_mobilenet_v2 TFLite model with the QUANTIZED_UINT8 inference type?
Thanks.
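For what it's worth: QUANTIZED_UINT8 stores weights and activations as 8-bit integers under an affine mapping real = scale * (q - zero_point), so the model is roughly 4x smaller than FLOAT (uint8 vs float32) and integer kernels tend to run faster on phone CPUs, but it needs quantization ranges (e.g. from quantization-aware training). A rough sketch of the mapping itself (the scale and zero-point values below are illustrative, not from any real model):

```python
def quantize(x, scale, zero_point):
    """Affine-quantize a float to uint8: q = round(x / scale) + zero_point."""
    q = int(round(x / scale)) + zero_point
    return max(0, min(255, q))  # clamp to the uint8 range

def dequantize(q, scale, zero_point):
    """Recover the approximate real value from the quantized integer."""
    return scale * (q - zero_point)

scale, zero_point = 0.02, 128  # illustrative quantization parameters
q = quantize(0.5, scale, zero_point)
x = dequantize(q, scale, zero_point)
print(q, round(x, 3))  # → 153 0.5
```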
I ran into another issue with the app's bazel build.
This is the recommended command:

```
bazel build -c opt --config=android_arm{,64} --cxxopt='--std=c++11' "//tensorflow/contrib/lite/examples/android:tflite_demo"
```

But here is my situation:

```
$ bazel build -c opt --config=android_arm{,64} --cxxopt='--std=c++11' "//tensorflow/contrib/lite/examples/android:tflite_demo"
WARNING: The following configs were expanded more than once: [android]. For repeatable flags, repeats are counted twice and may lead to unexpected behavior.
WARNING: option '--crosstool_top' was expanded to from both option '--config=cuda' (source /home/vip/tensorflow/.tf_configure.bazelrc) and option '--config=android_arm' (source command line options)
WARNING: option '--cpu' was expanded to from both option '--config=android_arm' (source command line options) and option '--config=android_arm64' (source command line options)
WARNING: option '--fat_apk_cpu' was expanded to from both option '--config=android_arm' (source command line options) and option '--config=android_arm64' (source command line options)
ERROR: No default_toolchain found for cpu 'arm64-v8a'. Valid cpus are: [
  k8,
  local,
  armeabi-v7a,
  x64_windows,
  x64_windows_msvc,
  x64_windows_msys,
  s390x,
  ios_x86_64,
]
INFO: Elapsed time: 1.870s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (3 packages loaded)
```
bazel version: 0.14.1
Please try

```
bazel build -c opt --cxxopt='--std=c++11' --fat_apk_cpu=x86,x86_64,arm64-v8a,armeabi-v7a \
  //tensorflow/contrib/lite/examples/android:tflite_demo
```
Did this work?
An APK compiled from the tensorflow master branch doesn't work on the phone without any modification:
```
03-10 05:03:13.130 24916 24916 E AndroidRuntime: Process: org.tensorflow.lite.demo, PID: 24916
03-10 05:03:13.130 24916 24916 E AndroidRuntime: java.lang.UnsatisfiedLinkError: No implementation found for long org.tensorflow.lite.NativeInterpreterWrapper.createErrorReporter(int) (tried Java_org_tensorflow_lite_NativeInterpreterWrapper_createErrorReporter and Java_org_tensorflow_lite_NativeInterpreterWrapper_createErrorReporter__I)
03-10 05:03:13.130 24916 24916 E AndroidRuntime: 	at org.tensorflow.lite.NativeInterpreterWrapper.createErrorReporter(Native Method)
03-10 05:03:13.130 24916 24916 E AndroidRuntime: 	at org.tensorflow.lite.NativeInterpreterWrapper.
```
An APK compiled from branch origin/r1.9 works well, but after replacing the model with my litessd_mobilenetv2 .tflite it doesn't work. The origin/r1.9 code seems to work well, but is too old for litessd_mobilenetv2.
```
03-10 05:19:23.929 27407 27407 E AndroidRuntime: FATAL EXCEPTION: main
03-10 05:19:23.929 27407 27407 E AndroidRuntime: Process: org.tensorflow.lite.demo, PID: 27407
03-10 05:19:23.929 27407 27407 E AndroidRuntime: java.lang.RuntimeException: java.lang.IllegalArgumentException: Internal error: Cannot create interpreter: Didn't find custom op for name 'TFLite_Detection_PostProcess' with version 1
03-10 05:19:23.929 27407 27407 E AndroidRuntime: Registration failed.
03-10 05:19:23.929 27407 27407 E AndroidRuntime:
03-10 05:19:23.929 27407 27407 E AndroidRuntime: 	at org.tensorflow.demo.TFLiteObjectDetectionAPIModel.create(TFLiteObjectDetectionAPIModel.java:175)
03-10 05:19:23.929 27407 27407 E AndroidRuntime: 	at org.tensorflow.demo.DetectorActivity.onPreviewSizeChosen(DetectorActivity.java:109)
03-10 05:19:23.929 27407 27407 E AndroidRuntime: 	at org.tensorflow.demo.CameraActivity$5.onPreviewSizeChosen(CameraActivity.java:362)
03-10 05:19:23.929 27407 27407 E AndroidRuntime: 	at org.tensorflow.demo.CameraConnectionFragment.setUpCameraOutputs(CameraConnectionFragment.java:401)
03-10 05:19:23.929 27407 27407 E AndroidRuntime: 	at org.tensorflow.demo.CameraConnectionFragment.openCamera(CameraConnectionFragment.java:408)
03-10 05:19:23.929 27407 27407 E AndroidRuntime: 	at org.tensorflow.demo.CameraConnectionFragment.access$000(CameraConnectionFragment.java:64)
03-10 05:19:23.929 27407 27407 E AndroidRuntime: 	at org.tensorflow.demo.CameraConnectionFragment$1.onSurfaceTextureAvailable(CameraConnectionFragment.java:95)
```
Thanks.
@RichardLiee Thanks for the error message. This error seems to occur when you push the APK to the phone. What phone/platform and architecture are you building for?
Phone: Qualcomm SDM845 / Android 8.1.

For the build command I tried both the one TensorFlow recommends:

```
bazel build -c opt --config=android_arm{,64} --cxxopt='--std=c++11' "//tensorflow/contrib/lite/examples/android:tflite_demo"
```

and yours:

```
bazel build -c opt --cxxopt='--std=c++11' --fat_apk_cpu=x86,x86_64,arm64-v8a,armeabi-v7a //tensorflow/contrib/lite/examples/android:tflite_demo
```

The APKs compiled by these two commands hit the same error when run on the phone.
@RichardLiee I faced the same problem as you.
Acknowledged. We pushed some updates, so please try once again and let us know what the new error message is.
@RichardLiee I used Android Studio and built this project successfully. The command line blocked me as well.
It works, thanks a lot.
@RichardLiee were you able to get the model running on Android? I'm trying to do the same thing, but after converting, the model outputs unexpected predictions (negative numbers, including negative decimals, for the class nodes). I'm testing this using the ssdlite model and the tflite demo too.
@normandra the pretrained model works fine on the phone, but after fine-tuning the model outputs incorrect values.
This is happening to me too. After further testing, it seems it doesn't happen all the time, and simply sanity-checking the output works around it. I think you should reopen the issue @RichardLiee
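The sanity check mentioned above can be as simple as dropping detections whose score is outside [0, 1] or below a threshold. A minimal sketch, assuming the usual four output arrays of the TFLite detection postprocess (boxes, classes, scores, count); all values below are made up for illustration:

```python
def filter_detections(boxes, classes, scores, count, min_score=0.5):
    """Keep only detections whose score is a sane probability above threshold."""
    keep = []
    for i in range(int(count)):
        s = scores[i]
        if 0.0 <= s <= 1.0 and s >= min_score:  # drops negative/garbage scores
            keep.append((boxes[i], int(classes[i]), s))
    return keep

# Illustrative values; a real app reads these from the interpreter's output tensors.
boxes = [[0.1, 0.1, 0.5, 0.5], [0.2, 0.2, 0.9, 0.9], [0.0, 0.0, 1.0, 1.0]]
classes = [1.0, 17.0, 3.0]
scores = [0.92, -0.37, 0.61]  # the negative score is the kind of garbage reported above
print(filter_detections(boxes, classes, scores, count=3))
```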
Issue: the fine-tuned model produces incorrect results.
I tried to convert the ssdlite_mobilenet_v2_coco_2018_05_09 model to TFLite using the following steps:
```
$ export CONFIG_FILE=/tmp/ssdlite_mobilenet_v2_coco_2018_05_09/pipeline.config
$ export CHECKPOINT_PATH=/tmp/ssdlite_mobilenet_v2_coco_2018_05_09/model.ckpt
$ export OUTPUT_DIR=/tmp/tflite
```
And I get an error:

```
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1334, in _do_call
    return fn(*args)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1319, in _run_fn
    options, feed_dict, fetch_list, target_list, run_metadata)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1407, in _call_tf_sessionrun
    run_metadata)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Assign requires shapes of both tensors to match. lhs shape= [1,1,256,546] rhs shape= [1,1,1280,546]
	 [[{{node save/Assign_22}} = Assign[T=DT_FLOAT, _class=["loc:@BoxPredictor_1/ClassPredictor/weights"], use_locking=true, validate_shape=true, _device="/job:localhost/replica:0/task:0/device:CPU:0"](BoxPredictor_1/ClassPredictor/weights, save/RestoreV2:22)]]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/training/saver.py", line 1546, in restore
    {self.saver_def.filename_tensor_name: save_path})
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 929, in run
    run_metadata_ptr)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1152, in _run
    feed_dict_tensor, options, run_metadata)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1328, in _do_run
    run_metadata)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/client/session.py", line 1348, in _do_call
    raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Assign requires shapes of both tensors to match. lhs shape= [1,1,256,546] rhs shape= [1,1,1280,546]
	 [[node save/Assign_22 (defined at /home/models/research/object_detection/export_tflite_ssd_graph_lib.py:255) = Assign[T=DT_FLOAT, _class=["loc:@BoxPredictor_1/ClassPredictor/weights"], use_locking=true, validate_shape=true, _device="/job:localhost/replica:0/task:0/device:CPU:0"](BoxPredictor_1/ClassPredictor/weights, save/RestoreV2:22)]]

Caused by op 'save/Assign_22', defined at:
  File "object_detection/export_tflite_ssd_graph.py", line 137, in

InvalidArgumentError (see above for traceback): Assign requires shapes of both tensors to match. lhs shape= [1,1,256,546] rhs shape= [1,1,1280,546]
	 [[node save/Assign_22 (defined at /home/models/research/object_detection/export_tflite_ssd_graph_lib.py:255) = Assign[T=DT_FLOAT, _class=["loc:@BoxPredictor_1/ClassPredictor/weights"], use_locking=true, validate_shape=true, _device="/job:localhost/replica:0/task:0/device:CPU:0"](BoxPredictor_1/ClassPredictor/weights, save/RestoreV2:22)]]
```
System information
- Model: ssdlite_mobilenet_v2_coco_2018_05_09
- Host: Ubuntu 16.04, using Docker
- TensorFlow version: built from source, '1.11.0-rc1'
- CPU only
I didn't find anyone else who had the same error as me, so I am confused. Who can help me solve this problem? I would really appreciate it!
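Not sure of the root cause in your setup, but the shapes in the error are informative: both tensors are 1x1 conv kernels for `BoxPredictor_1/ClassPredictor` with the same output depth (546 = 6 anchors per location x (90 classes + 1 background), the COCO SSD defaults, assumed here), while the input-channel dimension differs (256 vs 1280). That usually means the pipeline.config builds a different feature extractor (e.g. a different depth_multiplier or base network) than the one the checkpoint was trained with. A sketch of where the 546 comes from:

```python
def class_predictor_depth(num_anchors, num_classes):
    # SSD class predictor emits one logit per anchor per (class + background),
    # so the 1x1 conv's output depth is num_anchors * (num_classes + 1).
    return num_anchors * (num_classes + 1)

# Assumed ssdlite_mobilenet_v2_coco defaults: 6 anchors per location at this
# layer, 90 COCO classes -> output depth 546, matching the error's last dim.
print(class_predictor_depth(6, 90))  # → 546
# The mismatched dims (256 vs 1280) are the *input* channels of that conv,
# i.e. the feature-map depth feeding BoxPredictor_1; these must agree between
# the config-built graph and the checkpoint being restored.
```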
While using

```
python object_detection\export_tflite_ssd_graph.py --pipeline_config=inference_graph\pipeline.config --trained_checkpoint_prefix=\training\model.ckpt-122 --out_dir=object_detection\inference_graph --add_postprocessing_op=true
```

I get the error

```
TypeError: Expected binary or unicode string, got None
```

I'm trying to convert a model to TFLite for an Android app. Any advice?
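One possible explanation: the flag names used elsewhere in this thread are `--pipeline_config_path` and `--output_directory`, so `--pipeline_config` and `--out_dir` may leave the real flags unset at `None`, and TensorFlow's string coercion then raises exactly this TypeError. A minimal stdlib sketch of that failure mode (`as_bytes` below only mimics the behavior of `tf.compat.as_bytes`; it is not the real function):

```python
def as_bytes(value):
    """Mimic of the string-coercion check that produces the reported TypeError."""
    if isinstance(value, str):
        return value.encode("utf-8")
    if isinstance(value, bytes):
        return value
    raise TypeError("Expected binary or unicode string, got %r" % (value,))

# A misspelled flag never sets the real one, which stays at its None default:
flags = {"pipeline_config_path": None}
try:
    as_bytes(flags["pipeline_config_path"])
except TypeError as e:
    print(e)  # → Expected binary or unicode string, got None
```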
I'm also facing the same problem. If you solved it, please let me know.
Did you figure out how to get a TFLite model for the Android app?
Hi there, we are checking to see if you still need help on this, as this seems to be a considerably old issue. Please update this issue with the latest information, a code snippet to reproduce your issue, and the error you are seeing. If we don't hear from you in the next 7 days, this issue will be closed automatically. If you don't need help on this issue any more, please consider closing it.
System information
I exported the graph for ssdlite_mobilenet_v2_coco, fine-tuned from the pretrained ssdlite_mobilenet_v2_coco model in the model zoo (https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md). Here is my config script:
```
CONFIG_FILE=/home/vip/TisanBrain/sample/objd/models/research/object_detection/samples/configs/ssdlite_mobilenet_v2_coco.config
CHECKPOINT_PATH=/home/vip/fastData/objd/ssdlite_mobilenet_v2_coco_2018_05_09/model.ckpt-50000.index
OUTPUT_DIR=/home/vip/fastData/objd/ssdlite_mobilenet_v2_coco_2018_05_09/
python object_detection/export_tflite_ssd_graph.py \
  --pipeline_config_path=${CONFIG_FILE} \
  --trained_checkpoint_prefix=${CHECKPOINT_PATH} \
  --output_directory=$OUTPUT_DIR \
  --add_postprocessing_op=true
```
Here is the error log:
```
2018-08-14 19:19:11.789173: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 AVX512F FMA
2018-08-14 19:19:12.615432: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1392] Found device 0 with properties:
name: GeForce GTX 1080 Ti major: 6 minor: 1 memoryClockRate(GHz): 1.582
pciBusID: 0000:17:00.0
totalMemory: 10.92GiB freeMemory: 10.76GiB
2018-08-14 19:19:12.735887: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1392] Found device 1 with properties:
name: GeForce GTX 1080 Ti major: 6 minor: 1 memoryClockRate(GHz): 1.582
pciBusID: 0000:65:00.0
totalMemory: 10.92GiB freeMemory: 10.76GiB
2018-08-14 19:19:12.736823: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1471] Adding visible gpu devices: 0, 1
2018-08-14 19:19:13.210824: I tensorflow/core/common_runtime/gpu/gpu_device.cc:952] Device interconnect StreamExecutor with strength 1 edge matrix:
2018-08-14 19:19:13.210870: I tensorflow/core/common_runtime/gpu/gpu_device.cc:958]      0 1
2018-08-14 19:19:13.210878: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 0:   N Y
2018-08-14 19:19:13.210883: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 1:   Y N
2018-08-14 19:19:13.214277: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10411 MB memory) -> physical GPU (device: 0, name: GeForce GTX 1080 Ti, pci bus id: 0000:17:00.0, compute capability: 6.1)
2018-08-14 19:19:13.317100: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:1 with 10410 MB memory) -> physical GPU (device: 1, name: GeForce GTX 1080 Ti, pci bus id: 0000:65:00.0, compute capability: 6.1)
2018-08-14 19:19:14.348039: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1471] Adding visible gpu devices: 0, 1
2018-08-14 19:19:14.348136: I tensorflow/core/common_runtime/gpu/gpu_device.cc:952] Device interconnect StreamExecutor with strength 1 edge matrix:
2018-08-14 19:19:14.348144: I tensorflow/core/common_runtime/gpu/gpu_device.cc:958]      0 1
2018-08-14 19:19:14.348151: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 0:   N Y
2018-08-14 19:19:14.348156: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 1:   Y N
2018-08-14 19:19:14.348320: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10411 MB memory) -> physical GPU (device: 0, name: GeForce GTX 1080 Ti, pci bus id: 0000:17:00.0, compute capability: 6.1)
2018-08-14 19:19:14.348404: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1084] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:1 with 10410 MB memory) -> physical GPU (device: 1, name: GeForce GTX 1080 Ti, pci bus id: 0000:65:00.0, compute capability: 6.1)
Traceback (most recent call last):
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 1322, in _do_call
    return fn(*args)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 1307, in _run_fn
    options, feed_dict, fetch_list, target_list, run_metadata)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 1409, in _call_tf_sessionrun
    run_metadata)
tensorflow.python.framework.errors_impl.NotFoundError: Tensor name "BoxPredictor_0/BoxEncodingPredictor/biases" not found in checkpoint files /home/vip/fastData/objd/ssdlite_mobilenet_v2_coco_2018_05_09/model.ckpt-50000.index
	 [[Node: save/RestoreV2 = RestoreV2[dtypes=[DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, ..., DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_INT64], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save/Const_0_0, save/RestoreV2/tensor_names, save/RestoreV2/shape_and_slices)]]
	 [[Node: save/RestoreV2/_301 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device_incarnation=1, tensor_name="edge_306_save/RestoreV2", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]
```
During handling of the above exception, another exception occurred:
```
Traceback (most recent call last):
  File "object_detection/export_tflite_ssd_graph.py", line 137, in <module>
    tf.app.run(main)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/platform/app.py", line 125, in run
    _sys.exit(main(argv))
  File "object_detection/export_tflite_ssd_graph.py", line 133, in main
    FLAGS.max_classes_per_detection)
  File "/home/vip/TisanBrain/sample/objd/models/research/object_detection/export_tflite_ssd_graph_lib.py", line 262, in export_tflite_graph
    initializer_nodes='')
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/tools/freeze_graph.py", line 104, in freeze_graph_with_def_protos
    saver.restore(sess, input_checkpoint)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/training/saver.py", line 1768, in restore
    six.reraise(exception_type, exception_value, exception_traceback)
  File "/usr/lib/python3/dist-packages/six.py", line 686, in reraise
    raise value
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/training/saver.py", line 1752, in restore
    {self.saver_def.filename_tensor_name: save_path})
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 900, in run
    run_metadata_ptr)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 1135, in _run
    feed_dict_tensor, options, run_metadata)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 1316, in _do_run
    run_metadata)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 1335, in _do_call
    raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.NotFoundError: Tensor name "BoxPredictor_0/BoxEncodingPredictor/biases" not found in checkpoint files /home/vip/fastData/objd/ssdlite_mobilenet_v2_coco_2018_05_09/model.ckpt-50000.index
	 [[Node: save/RestoreV2 = RestoreV2[dtypes=[DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, ..., DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_INT64], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save/Const_0_0, save/RestoreV2/tensor_names, save/RestoreV2/shape_and_slices)]]
	 [[Node: save/RestoreV2/_301 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device_incarnation=1, tensor_name="edge_306_save/RestoreV2", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]
```
```
Caused by op 'save/RestoreV2', defined at:
  File "object_detection/export_tflite_ssd_graph.py", line 137, in <module>
    tf.app.run(main)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/platform/app.py", line 125, in run
    _sys.exit(main(argv))
  File "object_detection/export_tflite_ssd_graph.py", line 133, in main
    FLAGS.max_classes_per_detection)
  File "/home/vip/TisanBrain/sample/objd/models/research/object_detection/export_tflite_ssd_graph_lib.py", line 248, in export_tflite_graph
    saver = tf.train.Saver(**saver_kwargs)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/training/saver.py", line 1284, in __init__
    self.build()
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/training/saver.py", line 1296, in build
    self._build(self._filename, build_save=True, build_restore=True)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/training/saver.py", line 1333, in _build
    build_save=build_save, build_restore=build_restore)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/training/saver.py", line 781, in _build_internal
    restore_sequentially, reshape)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/training/saver.py", line 400, in _AddRestoreOps
    restore_sequentially)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/training/saver.py", line 832, in bulk_restore
    return io_ops.restore_v2(filename_tensor, names, slices, dtypes)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/ops/gen_io_ops.py", line 1463, in restore_v2
    shape_and_slices=shape_and_slices, dtypes=dtypes, name=name)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/framework/op_def_library.py", line 787, in _apply_op_helper
    op_def=op_def)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/framework/ops.py", line 3414, in create_op
    op_def=op_def)
  File "/home/vip/tensorflow/venv/local/lib/python3.5/site-packages/tensorflow/python/framework/ops.py", line 1740, in __init__
    self._traceback = self._graph._extract_stack()  # pylint: disable=protected-access

NotFoundError (see above for traceback): Tensor name "BoxPredictor_0/BoxEncodingPredictor/biases" not found in checkpoint files /home/vip/fastData/objd/ssdlite_mobilenet_v2_coco_2018_05_09/model.ckpt-50000.index
	 [[Node: save/RestoreV2 = RestoreV2[dtypes=[DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, ..., DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_INT64], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save/Const_0_0, save/RestoreV2/tensor_names, save/RestoreV2/shape_and_slices)]]
	 [[Node: save/RestoreV2/_301 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device_incarnation=1, tensor_name="edge_306_save/RestoreV2", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]
```