NVIDIA-AI-IOT / cuDLA-samples

YOLOv5 on Orin DLA

How to measure inference time for cuDLA standalone mode? #26

Closed · Railcalibur closed this 1 month ago

Railcalibur commented 7 months ago

The code only measures inference time for hybrid mode.

Can I get the correct inference time for standalone mode by commenting out the conditional statement? If not, how do I measure the time correctly?

https://github.com/NVIDIA-AI-IOT/cuDLA-samples/blob/main/src/yolov5.cpp#L261

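For context, the hybrid-mode timing behind the linked line works because in hybrid mode the DLA task is submitted to a CUDA stream, so CUDA events recorded on that stream bracket the task. A minimal sketch of that pattern follows; it is not the repo's exact code, and the `cudlaSubmitTask` call is shown as a commented placeholder with the device handle and task assumed to be set up elsewhere.

```cpp
// Minimal sketch, not the repo's exact code: CUDA-event timing around a
// hybrid-mode submission, where the DLA task rides on a CUDA stream.
#include <cuda_runtime.h>
#include <cstdio>

void timeHybridTask(cudaStream_t stream)
{
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start, stream);
    // cudlaSubmitTask(devHandle, &task, 1, stream, 0);  // hybrid mode: task is queued on `stream`
    cudaEventRecord(stop, stream);

    cudaEventSynchronize(stop);
    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    std::printf("hybrid-mode DLA task: %.3f ms\n", ms);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
}
```

In standalone mode the task is not submitted through a CUDA stream (synchronization goes through NvSciSync fences instead), so simply removing the conditional around this block would not make the event timestamps measure the DLA work.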

lynettez commented 6 months ago

We only recommend measuring the DLA task execution time with Nsight Systems.
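In practice that means capturing a timeline on the Orin target, e.g. with `nsys profile -t cuda,nvtx -o report ./app` (plus the DLA/accelerator tracing option appropriate for your Nsight Systems version), and reading the DLA task duration off the timeline. As a hedged sketch only, the standalone-mode submit-and-wait region can be bracketed with an NVTX range so it is easy to locate in that capture; the `cudlaSubmitTask` and `NvSciSyncFenceWait` calls below are commented placeholders assuming handles and fences created elsewhere.

```cpp
// Sketch (assumption, not from the repo): wrap the standalone-mode submit
// and fence wait in an NVTX range so it is easy to find in the Nsight
// Systems timeline captured with `nsys profile -t cuda,nvtx ...`.
#include <nvToolsExt.h>
#include <chrono>
#include <cstdio>

void submitStandaloneWithMarker()
{
    nvtxRangePushA("cudla_standalone_submit_and_wait");

    auto t0 = std::chrono::steady_clock::now();
    // cudlaSubmitTask(devHandle, &task, 1, NULL, 0);      // standalone mode: no CUDA stream
    // NvSciSyncFenceWait(&eofFence, cpuWaitContext, -1);  // block until the DLA signals completion
    auto t1 = std::chrono::steady_clock::now();

    nvtxRangePop();

    // Wall-clock latency as seen by the application; the DLA-only execution
    // time is read from the Nsight Systems timeline, per the recommendation above.
    double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
    std::printf("standalone submit + fence wait: %.3f ms\n", ms);
}
```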

lynettez commented 1 month ago

Closing since there has been no activity for several months, thanks!