laugh12321 / TensorRT-YOLO

🚀 Your YOLO Deployment Powerhouse. With the synergy of TensorRT Plugins, CUDA Kernels, and CUDA Graphs, experience lightning-fast inference speeds.
https://github.com/laugh12321/TensorRT-YOLO
GNU General Public License v3.0

[Bug]: Inference Precision Anomaly on Linux Environment #5

Closed: laugh12321 closed this issue 7 months ago

laugh12321 commented 7 months ago

I am encountering a precision anomaly during inference in a Linux environment. The inference results deviate from the expected values, suggesting a potential issue with numerical precision or computation.

Linux: [screenshot of anomalous inference results]

Windows: [screenshot of expected inference results]
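
To quantify the deviation rather than eyeball the screenshots, the raw output tensors from both platforms can be compared element by element. The sketch below is only an illustration: the dump files (`output_linux.bin`, `output_windows.bin`) and the dumping step itself are assumptions, not part of TensorRT-YOLO.

```cuda
// Hypothetical check: compare raw float32 output dumps produced by the same engine
// and the same input image on Linux and Windows, and report the largest deviation.
// The file names below are placeholders; TensorRT-YOLO does not produce these dumps.
#include <cmath>
#include <cstdio>
#include <vector>

static std::vector<float> load_floats(const char* path) {
    std::vector<float> data;
    FILE* f = std::fopen(path, "rb");
    if (!f) return data;
    std::fseek(f, 0, SEEK_END);
    long bytes = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);
    data.resize(static_cast<size_t>(bytes) / sizeof(float));
    std::fread(data.data(), sizeof(float), data.size(), f);
    std::fclose(f);
    return data;
}

int main() {
    std::vector<float> linux_out   = load_floats("output_linux.bin");
    std::vector<float> windows_out = load_floats("output_windows.bin");
    if (linux_out.empty() || linux_out.size() != windows_out.size()) {
        std::printf("missing dump or size mismatch\n");
        return 1;
    }
    float max_abs_diff = 0.0f;
    size_t worst_index = 0;
    for (size_t i = 0; i < linux_out.size(); ++i) {
        float diff = std::fabs(linux_out[i] - windows_out[i]);
        if (diff > max_abs_diff) { max_abs_diff = diff; worst_index = i; }
    }
    std::printf("max |Linux - Windows| = %g at index %zu\n", max_abs_diff, worst_index);
    return 0;
}
```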

laugh12321 commented 7 months ago

The inference anomaly on Linux was caused by preprocess.cu.
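
The comment above does not include the actual fix, so the snippet below is only a hedged illustration of one common way a CUDA preprocessing kernel can behave differently across operating systems: if the letterbox padding region is never written, the network input keeps whatever happened to be in the cudaMalloc'd buffer, and that garbage differs between platforms and drivers. The kernel name, signature, and nearest-neighbour sampling are assumptions made for this sketch, not the actual contents of preprocess.cu.

```cuda
// Hypothetical letterbox preprocessing kernel, shown only to illustrate the class of
// bug: padding pixels must be written explicitly, otherwise the uninitialized
// cudaMalloc'd buffer leaks into the network input and differs per OS/driver.
#include <cmath>
#include <cuda_runtime.h>

__global__ void letterbox_kernel(const unsigned char* src, int src_w, int src_h,
                                 float* dst, int dst_w, int dst_h,
                                 float scale, int pad_x, int pad_y, float pad_val) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= dst_w || y >= dst_h) return;

    int dst_idx = (y * dst_w + x) * 3;
    int box_w = static_cast<int>(src_w * scale);
    int box_h = static_cast<int>(src_h * scale);

    // A buggy kernel would simply `return` here without writing anything, leaving
    // the padding area as uninitialized memory. Writing the pad value fixes that.
    if (x < pad_x || y < pad_y || x >= pad_x + box_w || y >= pad_y + box_h) {
        dst[dst_idx + 0] = pad_val;
        dst[dst_idx + 1] = pad_val;
        dst[dst_idx + 2] = pad_val;
        return;
    }

    // Nearest-neighbour sampling, kept deliberately simple for the sketch.
    int sx = min(static_cast<int>((x - pad_x) / scale), src_w - 1);
    int sy = min(static_cast<int>((y - pad_y) / scale), src_h - 1);
    int src_idx = (sy * src_w + sx) * 3;
    dst[dst_idx + 0] = src[src_idx + 0] / 255.0f;
    dst[dst_idx + 1] = src[src_idx + 1] / 255.0f;
    dst[dst_idx + 2] = src[src_idx + 2] / 255.0f;
}

int main() {
    const int src_w = 1280, src_h = 720, dst_w = 640, dst_h = 640;
    float scale = std::fmin(static_cast<float>(dst_w) / src_w,
                            static_cast<float>(dst_h) / src_h);
    int pad_x = (dst_w - static_cast<int>(src_w * scale)) / 2;
    int pad_y = (dst_h - static_cast<int>(src_h * scale)) / 2;

    unsigned char* d_src = nullptr;
    float* d_dst = nullptr;
    cudaMalloc(&d_src, src_w * src_h * 3);
    cudaMalloc(&d_dst, dst_w * dst_h * 3 * sizeof(float));   // NOT zero-initialized
    cudaMemset(d_src, 128, src_w * src_h * 3);                // dummy grey image

    dim3 block(16, 16);
    dim3 grid((dst_w + block.x - 1) / block.x, (dst_h + block.y - 1) / block.y);
    letterbox_kernel<<<grid, block>>>(d_src, src_w, src_h, d_dst, dst_w, dst_h,
                                      scale, pad_x, pad_y, 114.0f / 255.0f);
    cudaDeviceSynchronize();

    cudaFree(d_src);
    cudaFree(d_dst);
    return 0;
}
```

An equivalent mitigation is to clear the destination buffer before every launch, but since cudaMemset works per byte it cannot write an arbitrary float pad value, so writing the padding from the kernel itself is usually the simpler, deterministic choice.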