kujason / avod

Code for 3D object detection for autonomous driving
MIT License
939 stars 347 forks

Inference Time #15

Closed: memoiry closed this issue 4 years ago

memoiry commented 6 years ago

Thanks for sharing your code!

I have followed the procedure in the README and trained the model for 53000 steps, then ran an experiment using the evaluator. Below is the output:

Step 53000: 450 / 3769, Inference on sample 001021
Step 53000: Eval RPN Loss: objectness 0.149, regression 0.095, total 0.244
Step 53000: Eval AVOD Loss: classification 0.038, regression 1.809, total 2.091
Step 53000: Eval AVOD Loss: localization 1.310, orientation 0.499
Step 53000: RPN Objectness Accuracy: 0.95703125
Step 53000: AVOD Classification Accuracy: 0.9880478087649402
Step 53000: Total time 0.577916145324707 s
Step 53000: 451 / 3769, Inference on sample 001022
Step 53000: Eval RPN Loss: objectness 0.026, regression 0.094, total 0.119
Step 53000: Eval AVOD Loss: classification 0.019, regression 0.942, total 1.080
Step 53000: Eval AVOD Loss: localization 0.897, orientation 0.045
Step 53000: RPN Objectness Accuracy: 0.9921875
Step 53000: AVOD Classification Accuracy: 0.9970443349753695
Step 53000: Total time 0.24765753746032715 s
Step 53000: 452 / 3769, Inference on sample 001025
Step 53000: Eval RPN Loss: objectness 0.172, regression 0.175, total 0.347
Step 53000: Eval AVOD Loss: classification 0.044, regression 3.892, total 4.282
Step 53000: Eval AVOD Loss: localization 3.537, orientation 0.354
Step 53000: RPN Objectness Accuracy: 0.970703125
Step 53000: AVOD Classification Accuracy: 0.9950884086444007
Step 53000: Total time 0.2989237308502197 s
Step 53000: 453 / 3769, Inference on sample 001026

I think the inference time (around 0.3 s) is slow compared with the 100 ms claimed in the paper. Any suggestions? I'm using a 1080 Ti GPU.

kujason commented 6 years ago

The total time shown during evaluation includes several additional operations that are not part of the network's forward pass, so it overestimates the actual inference time.
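One way to check this is to time only the forward pass itself, excluding the surrounding evaluation overhead, and to discard the first few warm-up iterations (which pay one-time graph and kernel initialization costs). The sketch below illustrates the idea; `run_network` and the warm-up count are placeholders, not the actual AVOD code:

```python
import time

def run_network(sample):
    # Placeholder for the real forward pass (in AVOD this would be a
    # TensorFlow session.run call); here it just simulates work.
    return sum(i * i for i in range(10000))

def time_inference(samples, warmup=5):
    """Average wall-clock time of the forward pass, skipping warm-up runs."""
    timings = []
    for i, sample in enumerate(samples):
        start = time.perf_counter()
        run_network(sample)
        elapsed = time.perf_counter() - start
        if i >= warmup:  # discard warm-up iterations
            timings.append(elapsed)
    return sum(timings) / len(timings)

avg = time_inference(range(20))
print(f"average forward-pass time: {avg:.4f} s")
```

Averaging over many samples after warm-up gives a much more stable number than the per-sample totals printed by the evaluator.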

yzhou-saic commented 6 years ago

Yes, Jason, a timing chart would be very helpful if it included both preprocessing time and inference time. 1) I profiled the preprocessing time myself, and it is almost 0.1 s, much larger than the 0.02 s reported in the paper. 2) In another open issue, it seems you verified that the "create bev map" step alone already takes 0.02 s, so how could the total preprocessing time be 0.02 s?
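To attribute time to individual pipeline stages (BEV map creation, the rest of the preprocessing, the network itself), a small per-stage timer can help. This is a minimal sketch with made-up stage bodies, not AVOD's actual pipeline:

```python
import time
from collections import defaultdict
from contextlib import contextmanager

stage_totals = defaultdict(float)

@contextmanager
def timed(stage):
    """Accumulate wall-clock time spent in each named pipeline stage."""
    start = time.perf_counter()
    yield
    stage_totals[stage] += time.perf_counter() - start

# Hypothetical stages; in AVOD these would wrap the BEV map generation,
# the remaining preprocessing, and the network forward pass.
with timed("create_bev_map"):
    bev = [i * 2 for i in range(100000)]
with timed("other_preprocessing"):
    feats = [v + 1 for v in bev]
with timed("inference"):
    out = sum(feats)

for stage, total in stage_totals.items():
    print(f"{stage}: {total:.4f} s")
```

Wrapping each stage this way makes it straightforward to see whether the preprocessing or the network dominates the total time.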

memoiry commented 6 years ago

Thanks for your response!

Yes, that's true. I re-measured the inference time; it is around 0.12 s, which is quite impressive.

sainisanjay commented 5 years ago

@kujason @memoiry @yzhou-saic, can I run this model on a CPU-only machine with 16 GB of RAM, just for testing with the pre-trained model?

kujason commented 4 years ago

You will need a computer with a GPU; I would not recommend trying to run this on a CPU only.