You need TensorFlow binaries that were compiled with TensorRT to use TF-TRT. For example, you can use the docker images from the NGC repository: https://catalog.ngc.nvidia.com/orgs/nvidia/containers/tensorflow
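If you want to verify whether a given TensorFlow binary was compiled with TensorRT before attempting a conversion, a quick check along these lines should work (this assumes TF 2.x; the `is_tensorrt_build` key is only reported by newer releases, hence the `.get()` with a default):

```python
import tensorflow as tf

# Inspect how the installed TensorFlow binary was built.
# `is_tensorrt_build` is only present in newer TF releases, so fall back to False.
build_info = tf.sysconfig.get_build_info()
print("CUDA build:    ", build_info.get("is_cuda_build", False))
print("TensorRT build:", build_info.get("is_tensorrt_build", False))
```

If the TensorRT flag is False (or missing), the binary was not linked against TensorRT and TF-TRT conversion will fail with the error below, which is why the NGC containers are suggested.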
Is there no way to install and use TensorRT without using the Docker images from NGC? I have installed tensorflow-gpu compatible with my Python version (3.8) and everything works well, but when I try to convert my model with TF-TRT I get this error:
```
ERROR:tensorflow:Tensorflow needs to be built with TensorRT support enabled to allow TF-TRT to operate.
Traceback (most recent call last):
  File "ModelConversion.py", line 29, in <module>
    converter = trt.TrtGraphConverterV2(
  File "/home/ml2/Projects/Kabodian/InstallPrograms/env/lib/python3.8/site-packages/tensorflow/python/util/deprecation.py", line 561, in new_func
    return func(*args, **kwargs)
  File "/home/ml2/Projects/Kabodian/InstallPrograms/env/lib/python3.8/site-packages/tensorflow/python/compiler/tensorrt/trt_convert.py", line 1219, in __init__
    _check_trt_version_compatibility()
  File "/home/ml2/Projects/Kabodian/InstallPrograms/env/lib/python3.8/site-packages/tensorflow/python/compiler/tensorrt/trt_convert.py", line 224, in _check_trt_version_compatibility
    raise RuntimeError("Tensorflow has not been built with TensorRT support.")
RuntimeError: Tensorflow has not been built with TensorRT support.
```
Can this error be solved without using the Docker image?
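For reference, a minimal sketch of the conversion step that fails here (the SavedModel paths are placeholders for your own directories; this only runs on a TensorRT-enabled TensorFlow build, such as the NGC containers):

```python
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Placeholder paths; replace with your own SavedModel directories.
INPUT_SAVED_MODEL_DIR = "saved_model"
OUTPUT_SAVED_MODEL_DIR = "saved_model_trt"

# TrtGraphConverterV2 raises the RuntimeError shown above at construction time
# when the TensorFlow binary was not built with TensorRT support.
params = trt.TrtConversionParams(precision_mode=trt.TrtPrecisionMode.FP16)
converter = trt.TrtGraphConverterV2(
    input_saved_model_dir=INPUT_SAVED_MODEL_DIR,
    conversion_params=params,
)
converter.convert()
converter.save(OUTPUT_SAVED_MODEL_DIR)
```

The check happens in `_check_trt_version_compatibility()` before any model is touched, so no amount of fiddling with the model or the conversion parameters helps; the fix is a TensorRT-enabled TensorFlow build (NGC container or a build from source with TensorRT).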
cmd:python tf2_inference.py --use_tftrt_model --precision int8 ERROR:tensorflow:Tensorflow needs to be built with TensorRT support enabled to allow TF-TRT to operate. Has anyone encountered a similar error?
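Same root cause: the error is raised because the TensorFlow binary itself lacks TensorRT support, so a TensorRT-enabled build (e.g. an NGC container) is needed regardless of the precision flag. As a side note, once you are on such a build, INT8 mode additionally requires a calibration input function; a rough sketch (the input shape, dtype, and paths here are placeholders for your model):

```python
import numpy as np
import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Placeholder calibration data; use representative inputs matching your model's
# real input shapes and dtypes.
def calibration_input_fn():
    for _ in range(10):
        yield (tf.constant(np.random.random((1, 224, 224, 3)).astype(np.float32)),)

params = trt.TrtConversionParams(
    precision_mode=trt.TrtPrecisionMode.INT8,
    use_calibration=True,
)
converter = trt.TrtGraphConverterV2(
    input_saved_model_dir="saved_model",  # placeholder path
    conversion_params=params,
)
# INT8 needs representative inputs to build the calibration table.
converter.convert(calibration_input_fn=calibration_input_fn)
converter.save("saved_model_trt_int8")  # placeholder path
```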