Megvii-BaseDetection / cvpods

All-in-one Toolbox for Computer Vision Research.
https://cvpods.readthedocs.io
Apache License 2.0

about inference time #86

Open · shenhaibb opened this issue 2 years ago

shenhaibb commented 2 years ago

Where is the inference time in log.txt?

2022-04-29 13:20:41.838 | INFO | cvpods.utils.dump.events:write:253 - eta: 0:00:00 iter: 90000/90000 total_loss: 0.106 loss_cls: 0.001 loss_box_reg: 0.102 num_fg_per_gt: 1.000 time: 0.1519 data_time: 0.0010 lr: 0.000006 max_mem: 1487M

Is "time: 0.1519" in seconds (0.1519 s) or milliseconds (151.9 ms)?

FateScript commented 2 years ago

0.15s
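
For reference, a minimal sanity check in plain Python (not cvpods code): the "time" field is the average wall-clock seconds per training iteration, and multiplying it by the iteration count roughly reproduces the overall training time reported in the training summary log further down.

```python
# Plain-Python sanity check (not cvpods code): in the training log, "time"
# is the average wall-clock seconds per iteration.
time_per_iter = 0.1519   # "time: 0.1519" -> 0.1519 s / it (151.9 ms)
num_iters = 90000        # "iter: 90000/90000"

total_seconds = time_per_iter * num_iters
hours, rem = divmod(int(total_seconds), 3600)
minutes, seconds = divmod(rem, 60)
print(f"~{hours}:{minutes:02d}:{seconds:02d} of pure iteration time")
# -> roughly 3:47:xx, consistent with the summary line
#    "Overall training speed: 89998 iterations in 3:47:47 (0.1519 s / it)"
```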

shenhaibb commented 2 years ago

2022-04-29 13:20:12.044 | INFO | cvpods.evaluation.evaluator:inference_on_dataset:112 - Start inference on 399 data samples
2022-04-29 13:20:13.201 | INFO | cvpods.evaluation.evaluator:inference_on_dataset:142 - Inference done 12/399. 0.0739 s / sample. ETA=0:00:28
2022-04-29 13:20:18.268 | INFO | cvpods.evaluation.evaluator:inference_on_dataset:142 - Inference done 80/399. 0.0742 s / sample. ETA=0:00:23
2022-04-29 13:20:23.279 | INFO | cvpods.evaluation.evaluator:inference_on_dataset:142 - Inference done 148/399. 0.0739 s / sample. ETA=0:00:18
2022-04-29 13:20:28.332 | INFO | cvpods.evaluation.evaluator:inference_on_dataset:142 - Inference done 217/399. 0.0736 s / sample. ETA=0:00:13
2022-04-29 13:20:33.359 | INFO | cvpods.evaluation.evaluator:inference_on_dataset:142 - Inference done 286/399. 0.0733 s / sample. ETA=0:00:08
2022-04-29 13:20:38.411 | INFO | cvpods.evaluation.evaluator:inference_on_dataset:142 - Inference done 355/399. 0.0733 s / sample. ETA=0:00:03
2022-04-29 13:20:41.711 | INFO | cvpods.evaluation.evaluator:inference_on_dataset:154 - Total inference time: 0:00:29.028718 (0.073677 s / sample per device, on 1 devices)
2022-04-29 13:20:41.712 | INFO | cvpods.evaluation.evaluator:inference_on_dataset:160 - Total inference pure compute time: 0:00:28 (0.073300 s / sample per device, on 1 devices)
2022-04-29 13:20:41.722 | INFO | cvpods.evaluation.coco_evaluation:_eval_predictions:203 - Preparing results for COCO format ...
2022-04-29 13:20:41.723 | INFO | cvpods.evaluation.coco_evaluation:_eval_predictions:222 - Saving results to ../outputs/model_logs/cvpods_playground/detection/coco/defcn/poto.res50.fpn.coco.800size.3x_ms/inference/coco_instances_results.json
2022-04-29 13:20:41.735 | INFO | cvpods.evaluation.coco_evaluation:_eval_predictions:231 - Evaluating predictions ...

2022-04-29 13:20:41.836 | INFO | cvpods.engine.runner:test:401 - Evaluation results for coco_2017_val in csv format:
2022-04-29 13:20:41.836 | INFO | cvpods.evaluation.testing:print_csv_format:26 - copypaste: Task: bbox
2022-04-29 13:20:41.836 | INFO | cvpods.evaluation.testing:print_csv_format:27 - copypaste: AP,AP50,AP75,APs,APm,APl
2022-04-29 13:20:41.836 | INFO | cvpods.evaluation.testing:print_csv_format:28 - copypaste: 46.9116,72.2490,52.5982,36.8020,43.9310,48.6259
2022-04-29 13:20:41.838 | INFO | cvpods.utils.dump.events:write:253 - eta: 0:00:00 iter: 90000/90000 total_loss: 0.106 loss_cls: 0.001 loss_box_reg: 0.102 num_fg_per_gt: 1.000 time: 0.1519 data_time: 0.0010 lr: 0.000006 max_mem: 1487M
2022-04-29 13:20:41.953 | INFO | cvpods.engine.hooks:after_train:200 - Overall training speed: 89998 iterations in 3:47:47 (0.1519 s / it)
2022-04-29 13:20:41.954 | INFO | cvpods.engine.hooks:after_train:208 - Total training time: 4:02:57 (0:15:09 on hooks)


I found "Total inference time: 0:00:29.028718 (0.073677 s / sample per device, on 1 devices)". So is my inference time 0.073677 s per sample?

FateScript commented 2 years ago

0.0736 seconds per sample per device.
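
As a rough reconstruction of where that number comes from, here is a short Python sketch using the figures from the log above. It assumes (not verified against the cvpods source here) that inference_on_dataset, like the detectron2-style evaluator it derives from, skips a few warmup samples before averaging, so the divisor is slightly smaller than 399.

```python
import datetime

# Numbers taken from the evaluation log above.
total_time = datetime.timedelta(seconds=29, microseconds=28718).total_seconds()
total_samples = 399
num_warmup = 5  # assumed warmup count excluded from the average

per_sample = total_time / (total_samples - num_warmup)
print(f"{per_sample:.6f} s / sample")  # ~0.073677, matching the log line
```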

shenhaibb commented 2 years ago

0.0736 seconds per sample per device.

Yeah, I have only one GPU. So is my inference time 0.1519 s or 0.0736 s?
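
A hedged reading based only on the log lines quoted above: 0.1519 s is the average training time per iteration (the "time" field, also reported as "0.1519 s / it" in the training summary), while 0.073677 s is the per-image inference time measured during evaluation. Expressed in more familiar units:

```python
# Plain arithmetic on the two numbers from the logs above (not cvpods API).
train_iter_time = 0.1519      # training: s / iteration ("0.1519 s / it")
infer_sample_time = 0.073677  # evaluation: s / sample ("0.073677 s / sample per device")

print(f"training:  {train_iter_time * 1e3:.1f} ms / iteration")
print(f"inference: {infer_sample_time * 1e3:.1f} ms / image "
      f"(~{1 / infer_sample_time:.1f} images / s on one GPU)")
```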