stark-t / PAI

Pollination_Artificial_Intelligence

Implement script for inference test speed on a usual CPU #56

Closed valentinitnelav closed 1 year ago

valentinitnelav commented 1 year ago

I'll use the test dataset and measure the inference speed on a single CPU, and on a GPU as well (GPU because some microcomputers like the NVIDIA Jetson Nano have a small onboard GPU, so it is good to have that as a reference too).

Test inference speed for YOLOv5n, YOLOv5s, and YOLOv7-tiny.
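A minimal, model-agnostic timing sketch of what such a speed test could look like. Everything here is a hypothetical illustration, not the repository's actual script: `fake_predict` is a stand-in workload, and a real run would call the loaded YOLO model instead.

```python
import time
import statistics

def mean_inference_time(predict, image, n_runs=10, warmup=3):
    """Mean wall-clock time per call (seconds), excluding warm-up runs."""
    for _ in range(warmup):
        predict(image)  # warm-up: lets caches and thread pools settle
    times = []
    for _ in range(n_runs):
        t0 = time.perf_counter()
        predict(image)
        times.append(time.perf_counter() - t0)
    return statistics.mean(times)

# Stand-in workload; a real benchmark would run model inference here.
fake_predict = lambda img: sum(v * v for v in img)
t = mean_inference_time(fake_predict, list(range(10_000)), n_runs=5)
print(f"mean time per call: {t * 1000:.3f} ms")
```

For a GPU measurement the same loop applies, but timing should only be read after the device has finished its queued work (e.g. an explicit synchronization call in PyTorch), otherwise the measured times are misleadingly short.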

stark-t commented 1 year ago

@valentinitnelav should we do just 10 runs at the best threshold combination (0.2 confidence, 0.9 IoU) and take the mean inference time, or compute the mean over all 81 threshold combinations?

valentinitnelav commented 1 year ago

I was thinking of something simpler: do the speed test using the optimal conf and IoU values that we got from the previous model-performance tests. Running all of this on a CPU could take a long time, and the inference time probably does not differ much across conf and IoU values anyway.

valentinitnelav commented 1 year ago

See #61 now