NVIDIA-AI-IOT / tf_trt_models

TensorFlow models accelerated with NVIDIA TensorRT
BSD 3-Clause "New" or "Revised" License

TensorRT is not enabled! HELP ME!!! #45

Open EEECGWood opened 5 years ago

EEECGWood commented 5 years ago

When I use TensorRT to optimize my TensorFlow graph:

```python
trt_graph = trt.create_inference_graph(
    input_graph_def=frozen_graph,
    outputs=output_names,
    max_batch_size=1,
    max_workspace_size_bytes=1 << 25,
    precision_mode='FP16',
    minimum_segment_size=50
)
```

I got errors like this:

```
INFO:tensorflow:Running against TensorRT version 0.0.0
Traceback (most recent call last):
  File "<stdin>", line 7, in <module>
  File "/root/anaconda3/lib/python3.6/site-packages/tensorflow/contrib/tensorrt/python/trt_convert.py", line 153, in create_inference_graph
    int(msg[0]))
tensorflow.python.framework.errors_impl.FailedPreconditionError: TensorRT is not enabled!
```

I have tried many suggested fixes, but none of them worked. Can you help me? I really need to solve this as soon as possible!
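A note on the log line above: `Running against TensorRT version 0.0.0` means the TensorFlow build reports no linked TensorRT library at all, which is what triggers the `FailedPreconditionError`. As a minimal sketch (the helper name `tensorrt_linked` is hypothetical, not a TensorFlow API), the condition can be detected from that version string:

```python
# Hypothetical helper, not part of TensorFlow: detect the "TensorRT is not
# enabled" condition from the linked-TensorRT version string that TensorFlow
# logs. A build without TensorRT support reports "0.0.0".
def tensorrt_linked(version_string):
    """Return True if the reported linked TensorRT version is a real release."""
    return tuple(int(p) for p in version_string.split(".")) > (0, 0, 0)

print(tensorrt_linked("0.0.0"))    # False: build has no TensorRT support
print(tensorrt_linked("5.0.2.6"))  # True: a real linked TensorRT release
```

In other words, installing TensorRT system-wide is not enough; the TensorFlow wheel itself must have been built with TensorRT support, otherwise it keeps reporting version 0.0.0.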

isra60 commented 5 years ago

Which versions of TensorFlow, CUDA, and TensorRT do you have installed?

EEECGWood commented 5 years ago

> Which versions of TensorFlow, CUDA, and TensorRT do you have installed?

I use TensorFlow 1.10.0, CUDA 9, TensorRT 5.0.2.6, cuDNN 7.3.1.

isra60 commented 5 years ago

What is your hardware? A Jetson TX2? If that is the case, your TensorRT version is too new: on JetPack 3.3 you should stay on TensorRT 4.
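The compatibility point above can be sketched as a small check. This is illustrative only: the JetPack 3.3 → TensorRT 4 pairing comes from the comment, and the idea is that the prebuilt TensorFlow for a given JetPack expects the TensorRT that JetPack ships, so a manually installed TensorRT 5.0.2.6 on JetPack 3.3 leaves TensorFlow reporting version 0.0.0.

```python
# Illustrative sketch, not an NVIDIA tool: JetPack release -> major TensorRT
# version it ships. Only the 3.3 -> 4 entry is taken from the comment above;
# extend the table for other JetPack releases as needed.
JETPACK_TENSORRT = {"3.3": 4}

def compatible(jetpack, trt_major):
    """True if the installed TensorRT major version matches what JetPack ships."""
    return JETPACK_TENSORRT.get(jetpack) == trt_major

print(compatible("3.3", 5))  # False: TensorRT 5.0.2.6 on JetPack 3.3, as in this issue
print(compatible("3.3", 4))  # True: the pairing the comment recommends
```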