matterport / Mask_RCNN

Mask R-CNN for object detection and instance segmentation on Keras and TensorFlow

Decreasing inference time #758

Open msieb1 opened 6 years ago

msieb1 commented 6 years ago

Which hyperparameters have the most influence on reducing inference time without degrading detection performance too drastically?

jlognn commented 6 years ago

Try using a resnet50 backbone instead of resnet101.
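
A minimal sketch of what that looks like with this repo's config classes (`mrcnn.config.Config` / `mrcnn.model.MaskRCNN`); the class name, log directory, and weights path below are placeholders for your own setup. Note that the weights file must match the backbone you choose, so switching to resnet50 means using weights trained with a resnet50 backbone.

```python
# Sketch: inference config with a lighter backbone (matterport Mask_RCNN API).
from mrcnn.config import Config
from mrcnn import model as modellib

class FastInferenceConfig(Config):
    NAME = "fast_inference"    # any identifier; used for log directories
    BACKBONE = "resnet50"      # default is "resnet101"; resnet50 is cheaper per image
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1         # effective batch size = GPU_COUNT * IMAGES_PER_GPU

config = FastInferenceConfig()
model = modellib.MaskRCNN(mode="inference", config=config, model_dir="./logs")
model.load_weights("path/to/resnet50_mask_rcnn_weights.h5", by_name=True)
# results = model.detect([image], verbose=0)
```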

Other than that, inference, like training, is quite hardware dependent, which is why NVIDIA builds inference-specific hardware such as the Tesla P4 and the Jetson module. If you are running inference on an AWS/Google Cloud instance or similar, try increasing the core count and make sure Keras/TensorFlow is actually using those cores.
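
For CPU inference, one way to do that (a sketch for TF 1.x and standalone Keras, which this repo uses) is to configure the TensorFlow session's thread pools before building the model. The thread counts here are assumptions; tune them to the instance's actual vCPU count.

```python
# Sketch: let TensorFlow use more CPU threads for inference (TF 1.x + Keras backend).
import tensorflow as tf
import keras.backend as K

num_cores = 8  # assumption: set to the vCPU count of your instance

session_config = tf.ConfigProto(
    intra_op_parallelism_threads=num_cores,  # threads used inside a single op (e.g. a conv)
    inter_op_parallelism_threads=num_cores,  # threads used across independent ops
    allow_soft_placement=True,
)
K.set_session(tf.Session(config=session_config))
# Build the MaskRCNN model after setting the session so it picks up this config.
```

On GPU instances, batching several images per `detect()` call (by raising `IMAGES_PER_GPU`) also amortizes per-call overhead if your workload allows it.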