This is a simple demonstration of running a Keras model on TensorFlow with TensorRT integration (TF-TRT), or on TensorRT directly, without invoking "freeze_graph.py".
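As a rough illustration of the TF-TRT path, the sketch below freezes a Keras model in memory (no "freeze_graph.py") and hands the frozen graph to TF-TRT via the TensorFlow 1.x `tf.contrib.tensorrt` module. The model file name is a placeholder and not part of this repository; this is only an assumed workflow, not the exact code in `trt_exampleOG.py`.

```python
# Sketch: freeze a Keras model in memory and optimize it with TF-TRT (TensorFlow 1.x).
# "my_model.h5" is a placeholder path, not a file from this repo.
import tensorflow as tf
from tensorflow.python.framework import graph_util
from tensorflow.contrib import tensorrt as trt
from keras import backend as K
from keras.models import load_model

K.set_learning_phase(0)                      # inference mode
model = load_model("my_model.h5")            # placeholder model file
sess = K.get_session()

# Freeze the graph in memory instead of calling freeze_graph.py.
output_names = [out.op.name for out in model.outputs]
frozen_graph = graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), output_names)

# Let TF-TRT replace supported subgraphs with TensorRT engines.
trt_graph = trt.create_inference_graph(
    input_graph_def=frozen_graph,
    outputs=output_names,
    max_batch_size=1,
    max_workspace_size_bytes=1 << 30,
    precision_mode="FP32")
```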
Engine creation does not work with TensorRT 5, because its Python API differs from earlier releases.
Some of the resulting errors:
    Using TensorFlow backend.
    Traceback (most recent call last):
      File "trt_exampleOG.py", line 9, in <module>
        import tensorrt.parsers.uffparser as uffparser
    ImportError: No module named 'tensorrt.parsers'

    DEBUG: convert reshape to flatten node
    No. nodes: 24
    Traceback (most recent call last):
      File "trt_exampleOG.py", line 125, in <module>
        main()
      File "trt_exampleOG.py", line 114, in main
        engine = TrtEngine(model, 1000)
      File "trt_exampleOG.py", line 39, in __init__
        G_LOGGER = trt.infer.ConsoleLogger(trt.infer.LogSeverity.ERROR)
    AttributeError: module 'tensorrt' has no attribute 'infer'
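Both errors come from the pre-5.0 Python API (`tensorrt.parsers.uffparser`, `trt.infer`), which was removed in TensorRT 5. Below is a minimal sketch of the corresponding TensorRT 5 calls for building an engine from a UFF file; the file name and the input/output tensor names are placeholders, not values taken from this repository.

```python
# Sketch of the TensorRT 5 Python API that replaces tensorrt.parsers.uffparser
# and trt.infer used in trt_exampleOG.py. "model.uff", "input_1" and
# "dense_1/Softmax" are placeholder names.
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.ERROR)    # replaces trt.infer.ConsoleLogger

with trt.Builder(TRT_LOGGER) as builder, \
     builder.create_network() as network, \
     trt.UffParser() as parser:              # replaces tensorrt.parsers.uffparser
    # Register the network's input/output tensors, then parse the UFF model.
    parser.register_input("input_1", (3, 224, 224))
    parser.register_output("dense_1/Softmax")
    parser.parse("model.uff", network)

    # Build the engine (replaces the old trt.utils/trt.infer helpers).
    builder.max_batch_size = 1
    builder.max_workspace_size = 1 << 30
    engine = builder.build_cuda_engine(network)
```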