The total time shown during evaluation includes several additional operations that are not relevant to the network inference timing, such as the pre-processing discussed below (e.g. creating the BEV maps).
Yes, Jason, a timing chart could be very helpful if it includes both the pre-processing time and the inference time.
1) I profiled the pre-processing time myself and got almost 0.1 s, much larger than the 0.02 s reported in the paper.
2) In another open issue, you seem to have verified that the "create bev map" step alone already takes 0.02 s, so how could the total pre-processing time be only 0.02 s?
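For reference, here is a minimal sketch of the kind of per-stage timing I mean; the `build_bev_input` helper and `model_outputs` tensors are hypothetical placeholders, not this repo's actual API:

```python
import time

def profile_one_sample(sess, model_outputs, sample):
    """Time pre-processing and the network forward pass separately (sketch)."""
    # Pre-processing stage, e.g. building the BEV maps from the point cloud.
    t0 = time.perf_counter()
    feed_dict = build_bev_input(sample)  # hypothetical pre-processing helper
    preprocess_time = time.perf_counter() - t0

    # Network inference only: a single forward pass through the graph
    # (assumes the TF 1.x session-style API).
    t1 = time.perf_counter()
    predictions = sess.run(model_outputs, feed_dict=feed_dict)
    inference_time = time.perf_counter() - t1

    return preprocess_time, inference_time, predictions
```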
Thanks for your response!
Yes, that's true. I re-measured the inference time and it is around 0.12 s, which is quite impressive.
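In case it helps anyone re-measuring, here is a small sketch of the kind of averaging loop I mean (the `run_one_forward_pass` callable is a hypothetical stand-in for a single network inference):

```python
import time

def average_inference_time(run_one_forward_pass, num_warmup=5, num_runs=50):
    """Average per-frame inference time, excluding warm-up iterations.

    The first few runs are skipped because graph/GPU initialization makes
    them much slower than steady-state inference.
    """
    for _ in range(num_warmup):
        run_one_forward_pass()

    start = time.perf_counter()
    for _ in range(num_runs):
        run_one_forward_pass()
    return (time.perf_counter() - start) / num_runs
```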
@kujason @memoiry @yzhou-saic, can I run this model on a CPU-only machine with 16 GB RAM, just for testing the pre-trained model?
You will need a computer with a GPU; I would not recommend trying to run this on a CPU only.
Thanks for sharing your code!
I have followed the procedure in the README and trained the model for 53,000 steps. I then ran an experiment using the evaluator; below is the output.
I think the inference time (around 0.3 s) is slow compared with the 100 ms claimed in the paper. Any suggestions? I'm using a 1080 Ti GPU.
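For anyone seeing similar numbers, one quick sanity check (assuming the TensorFlow 1.x setup this repo targets) is to confirm that the session actually sees the GPU, since a silent CPU fallback would easily produce per-frame times of several hundred milliseconds:

```python
from tensorflow.python.client import device_lib

# List the devices TensorFlow can use; an empty list here means the
# 1080 Ti is not visible and inference is running on the CPU.
gpus = [d.name for d in device_lib.list_local_devices()
        if d.device_type == 'GPU']
print(gpus)
```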