Closed: oxozle closed this issue 6 years ago
Expected Behavior

GPU-based detection should not keep a CPU core at 100%.

Current Behavior

When I use the GPU to detect faces with cnn_face_detection_model_v1 (mmod_human_face_detector.dat), a single CPU core sits at 100% usage. The strace utility shows a constant stream of calls like:

clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887535131}) = 0
clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887549546}) = 0
clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887563930}) = 0
clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887578327}) = 0
clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887592756}) = 0
clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887607184}) = 0

top shows roughly 57% user and 25% system time. Detecting 1000 images takes about 10 seconds (which is fine), but CPU usage stays at 100%.

Steps to Reproduce

import cv2
import dlib

# files is a list of image file paths
detector = dlib.cnn_face_detection_model_v1('weights/mmod_human_face_detector.dat')
for file in files:
    img = cv2.imread(file, cv2.IMREAD_COLOR)
    detector(img, 1)

Environment: Ubuntu 16.04, Python 3.6, dlib 19.15.0, GeForce GTX 1060 6GB.
That’s how it works. It’s not a bug.
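For context: the clock_gettime stream is consistent with the host thread busy-polling while it waits for the GPU to finish each forward pass, so one core stays pegged even though the heavy lifting happens on the GPU. If the per-image call overhead is a concern, dlib's cnn_face_detector.py example mentions a batched call on a list of images. The following is a minimal sketch, not the fix the maintainer prescribed, assuming that batched overload is available in your build; the images/*.jpg glob and batch_size value are hypothetical placeholders, and the weights path is taken from the report above.

import glob

import cv2
import dlib

print(dlib.DLIB_USE_CUDA)  # True if this dlib build was compiled with CUDA

detector = dlib.cnn_face_detection_model_v1('weights/mmod_human_face_detector.dat')

# Hypothetical input set; dlib expects RGB, while OpenCV loads BGR.
files = sorted(glob.glob('images/*.jpg'))
imgs = [cv2.cvtColor(cv2.imread(f, cv2.IMREAD_COLOR), cv2.COLOR_BGR2RGB) for f in files]

# Batched call, as shown in dlib's cnn_face_detector.py example.
# Assumes all images share the same dimensions, since a batch is packed into one tensor.
all_dets = detector(imgs, 1, batch_size=32)
for f, dets in zip(files, all_dets):
    print(f, len(dets))

Even with batching, the thread driving the GPU will still show high utilization while it waits, which is the expected behavior described above.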