tensorflow / tensorrt

TensorFlow/TensorRT integration
Apache License 2.0
737 stars 225 forks

[Jetson] No OpKernel was registered to support Op 'TRTEngineOp' #326

Open Pacifist-99 opened 1 year ago

Pacifist-99 commented 1 year ago

I have been getting this error while running inference on TF-TRT converted models (centernet_hg104_512x512_coco17_tpu-8 and a yolov4-tiny model converted to TensorFlow). Both models fail with the same error.

- Platform: Jetson TX2
- OS: Ubuntu 20.01
- Python: 3.6.9
- TensorFlow: 2.3.0

Can you please help me find a solution for this error?

DEKHTIARJonathan commented 1 year ago

It looks like you are using a TensorFlow version that was compiled without TensorRT and TF-TRT support. Where did you download tensorflow from ?
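To check this from Python, a minimal probe like the following can help. This is a sketch, not from the thread; it assumes TF 2.x, and the internal module paths (`tensorflow.python.compiler.tensorrt`, `op_def_registry`) are not stable public API and may differ between releases.

```python
# Sketch (assumption: TF 2.x internal module layout): probe whether the
# installed TensorFlow build registers the TRTEngineOp custom op.

def tensorrt_support_available():
    """Return True if TF-TRT appears usable in this TF build."""
    try:
        # Importing the TF-TRT converter fails (or warns) when TensorFlow
        # was built without TensorRT; the op-registry lookup then confirms
        # the TRTEngineOp kernel itself is registered.
        from tensorflow.python.compiler.tensorrt import trt_convert  # noqa: F401
        from tensorflow.python.framework import op_def_registry
        return op_def_registry.get("TRTEngineOp") is not None
    except ImportError:
        return False

if __name__ == "__main__":
    print("TF-TRT available:", tensorrt_support_available())
```

If this prints `False` on a stock pip wheel, the fix is to install a TensorFlow build that was compiled with TensorRT (e.g. the NVIDIA JetPack wheels for Jetson).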

DEKHTIARJonathan commented 1 year ago

@MattConley CC

Pacifist-99 commented 1 year ago

> It looks like you are using a TensorFlow version that was compiled without TensorRT and TF-TRT support. Where did you download tensorflow from ?

I am using TF version 2.3.0, installed from the NVIDIA JetPack wheel index described at https://developer.download.nvidia.com/compute/redist/jp/v44 (tensorflow==2.3.0+nv20.09).

Pacifist-99 commented 1 year ago

> It looks like you are using a TensorFlow version that was compiled without TensorRT and TF-TRT support. Where did you download tensorflow from ?

If you know, can you tell me where I should download TensorFlow 2.3.0 compiled with TensorRT?

Pacifist-99 commented 1 year ago

> It looks like you are using a TensorFlow version that was compiled without TensorRT and TF-TRT support. Where did you download tensorflow from ?

I have now compiled TensorFlow with TensorRT, but the issue persists. Is there any way to check whether the compilation was done correctly?
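One way to sanity-check a conversion (a sketch, not from the thread) is to count `TRTEngineOp` nodes in the converted SavedModel's serving graph: a successful TF-TRT conversion replaces supported subgraphs with these nodes, so zero matches suggests the conversion fell back to plain TensorFlow. The SavedModel path below is hypothetical, and the `serving_default` signature name assumes the default export.

```python
# Sketch (assumptions: TF 2.x SavedModel produced by TF-TRT; the model
# path in the usage note is hypothetical).

def count_ops(nodes, op_name="TRTEngineOp"):
    """Count graph nodes whose op type equals op_name."""
    return sum(1 for node in nodes if node.op == op_name)

def count_trt_engine_ops(saved_model_dir):
    """Load a SavedModel and count TRTEngineOp nodes in its graph."""
    import tensorflow as tf  # imported here so count_ops stays TF-free

    model = tf.saved_model.load(saved_model_dir)
    graph_def = model.signatures["serving_default"].graph.as_graph_def()
    total = count_ops(graph_def.node)
    # TRTEngineOp nodes often live inside the graph's function library.
    for func in graph_def.library.function:
        total += count_ops(func.node_def)
    return total

# Usage (hypothetical path):
# print(count_trt_engine_ops("converted_savedmodel/"))
```

A nonzero count confirms the converter emitted TRT engines; running inference on the TRT-enabled device then exercises them.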

DEKHTIARJonathan commented 1 year ago

@MattConley to help