MarvinKlemp opened 1 year ago
This is because inference_detector() composes the test pipeline every time it is called. You can avoid that cost by composing the pipeline once and calling model.test_step() directly. See https://github.com/open-mmlab/mmdetection3d/blob/main/mmdet3d/apis/inference.py
Prerequisite
Task
I'm using the official example scripts/configs for the officially supported tasks/models/datasets.
Branch
main branch https://github.com/open-mmlab/mmdetection3d
Environment
I am using the provided Dockerfile
Reproduces the problem - code sample
detect.py
Reproduces the problem - command or script
using the official weights:
Reproduces the problem - error message
The model is extremely slow: ~2.6 s per inference.
Additional information
I expected inference to be significantly faster.
Any ideas?