Ghustwb / MobileNet-SSD-TensorRT

Accelerate MobileNet-SSD with TensorRT

TensorRT's result changes every time #23

Open arleyzhang opened 5 years ago

arleyzhang commented 5 years ago

@Ghustwb I found your question on NVIDIA's forum, here: https://devtalk.nvidia.com/default/topic/1043279/same-tensorrt-code-get-different-result/?offset=10#5307682 I built an SSD TensorRT version from the same GitHub code you mentioned, https://github.com/saikumarGadde/tensorrt-ssd-easy, and now I have run into the same problems you raised on NVIDIA's forum:

  1. The result changes every time I run the inference. The variation is slight; the results shown below differ between two detections of image-71. (A sketch for quantifying this variation follows the list.)
  2. There is a big difference between the 1080 Ti and the Jetson TX2: the confidence for image-71 is much lower on the TX2, and I don't know why.
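One way I could measure how large the run-to-run variation actually is: dump the raw detection-output buffer to a binary file after each run and compare the two dumps element-wise. The sketch below is only illustrative and is not from either repository; the file names, the float32 dump format, and the `loadDump` helper are assumptions.

```cpp
// compare_runs.cpp -- compare two raw float32 dumps of the detection-output
// buffer (written after two separate inference runs on the same image) and
// report the largest element-wise difference.
// NOTE: file format and names are hypothetical; adapt to however the output
// buffer is actually saved.
#include <cstdio>
#include <cstdlib>
#include <cmath>
#include <vector>

static std::vector<float> loadDump(const char* path) {
    FILE* f = std::fopen(path, "rb");
    if (!f) { std::perror(path); std::exit(1); }
    std::fseek(f, 0, SEEK_END);
    long bytes = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);
    std::vector<float> data(static_cast<size_t>(bytes) / sizeof(float));
    if (std::fread(data.data(), sizeof(float), data.size(), f) != data.size()) {
        std::fprintf(stderr, "short read on %s\n", path);
        std::exit(1);
    }
    std::fclose(f);
    return data;
}

int main(int argc, char** argv) {
    if (argc != 3) {
        std::fprintf(stderr, "usage: %s run1.bin run2.bin\n", argv[0]);
        return 1;
    }
    std::vector<float> a = loadDump(argv[1]);
    std::vector<float> b = loadDump(argv[2]);
    if (a.size() != b.size() || a.empty()) {
        std::fprintf(stderr, "dumps are empty or have different sizes\n");
        return 1;
    }
    // Track the worst element-wise deviation between the two runs.
    float maxAbs = 0.f;
    size_t worst = 0;
    for (size_t i = 0; i < a.size(); ++i) {
        float d = std::fabs(a[i] - b[i]);
        if (d > maxAbs) { maxAbs = d; worst = i; }
    }
    std::printf("elements: %zu  max |diff|: %g at index %zu (%g vs %g)\n",
                a.size(), maxAbs, worst, a[worst], b[worst]);
    return 0;
}
```

Something like `g++ -O2 -o compare_runs compare_runs.cpp && ./compare_runs run1.bin run2.bin` would print the measured maximum difference; reporting that number (together with whether the engine was built in FP32 or FP16) might make it easier to judge whether the variation is just floating-point noise.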

I tried your code. Although your implementation is MobileNet-SSD, it seems you have solved this problem? I ran it many times on the 1080 Ti and got the same coordinate values every time, and on the Jetson TX2 the results vary only slightly and stay close to each other.

Could you give me some advice? I would be very grateful.