NVIDIA-AI-IOT / cuDLA-samples

YOLOv5 on Orin DLA

How to measure inference time for cuDLA standalone mode? #26

Closed Railcalibur closed 2 months ago

Railcalibur commented 9 months ago

The code only measures inference time for hybrid mode.

Can I get the correct inference time for standalone mode by commenting out the conditional statement? If not, how do I measure the time correctly?

https://github.com/NVIDIA-AI-IOT/cuDLA-samples/blob/main/src/yolov5.cpp#L261

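For illustration, a CPU-side wall-clock measurement of the standalone path might look like the sketch below. It assumes a device handle, a prepared task, an end-of-frame fence, and an NvSciSync CPU wait context already set up as in this repo's standalone context code; the function and variable names are placeholders. Note that this measures submission plus fence-wait latency from the CPU, not pure DLA execution time.

```cpp
// Rough wall-clock timing sketch for cuDLA standalone mode (not an
// accurate DLA execution time; see the maintainer's reply below).
#include <chrono>
#include <cstdio>
#include "cudla.h"
#include "nvscisync.h"

// Placeholder helper: dev, task, eofFence, and waitCtx are assumed to be
// created and configured elsewhere, as in the sample's standalone setup.
double timeStandaloneSubmitMs(cudlaDevHandle dev,
                              cudlaTask* task,
                              NvSciSyncFence* eofFence,
                              NvSciSyncCpuWaitContext waitCtx)
{
    auto t0 = std::chrono::high_resolution_clock::now();

    // In standalone mode the task is submitted without a CUDA stream.
    cudlaStatus status = cudlaSubmitTask(dev, task, 1, /*stream=*/nullptr, 0);
    if (status != cudlaSuccess)
    {
        std::printf("cudlaSubmitTask failed: %d\n", static_cast<int>(status));
        return -1.0;
    }

    // Block on the end-of-frame fence from the CPU side (-1 = wait forever).
    NvSciError err = NvSciSyncFenceWait(eofFence, waitCtx, /*timeoutUs=*/-1);
    if (err != NvSciError_Success)
    {
        std::printf("NvSciSyncFenceWait failed: %d\n", static_cast<int>(err));
        return -1.0;
    }

    auto t1 = std::chrono::high_resolution_clock::now();
    // Milliseconds of wall-clock time: submission overhead + queueing +
    // DLA execution + fence signalling, not pure DLA execution time.
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}
```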

lynettez commented 8 months ago

We recommend measuring the DLA task execution time only with Nsight Systems.
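For example, on a Jetson target the DLA task duration appears on the Nsight Systems timeline when accelerator trace is enabled alongside CUDA trace. A typical invocation looks something like the following; the application name is a placeholder, and the exact `--accelerator-trace` value can differ between the nsys versions shipped with different JetPack releases:

```sh
# Capture CUDA activity plus DLA (accelerator) activity, then open the
# generated report in the Nsight Systems GUI and read the DLA task
# duration directly off the timeline.
nsys profile --trace=cuda,nvtx --accelerator-trace=nvmedia -o cudla_standalone_report ./your_cudla_app
```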

lynettez commented 2 months ago

Closing since there has been no activity for several months, thanks!