davisking / dlib

A toolkit for making real world machine learning and data analysis applications in C++
http://dlib.net
Boost Software License 1.0

High CPU usage when I use GPU #1447

Closed · oxozle closed this 6 years ago

oxozle commented 6 years ago

When I use the GPU to detect faces with cnn_face_detection_model_v1 (mmod_human_face_detector.dat), a single CPU core sits at 100% usage. strace shows a flood of calls like:

clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887535131}) = 0
clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887549546}) = 0
clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887563930}) = 0
clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887578327}) = 0
clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887592756}) = 0
clock_gettime(CLOCK_MONOTONIC_RAW, {2717, 887607184}) = 0

top shows roughly 57% us (user) and 25% sy (system).
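
One way to confirm the thread is busy-polling rather than sleeping is to compare wall-clock time against CPU time around a single call. A minimal sketch; 'some_face.jpg' is a placeholder image path:

import time

import cv2
import dlib

detector = dlib.cnn_face_detection_model_v1('weights/mmod_human_face_detector.dat')
img = cv2.imread('some_face.jpg', cv2.IMREAD_COLOR)  # placeholder test image

wall0, cpu0 = time.perf_counter(), time.process_time()
detector(img, 1)
wall1, cpu1 = time.perf_counter(), time.process_time()

# If process CPU time is close to wall time, the process burned a full
# core during the call instead of sleeping while the GPU did the work.
print('wall %.3fs  cpu %.3fs' % (wall1 - wall0, cpu1 - cpu0))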

Expected Behavior

GPU-based detection should not keep a CPU core pegged at 100%.

Current Behavior

Detecting 1000 images takes about 10 s (which is fine), but CPU usage stays at 100% the whole time.

Steps to Reproduce

import cv2
import dlib

detector = dlib.cnn_face_detection_model_v1('weights/mmod_human_face_detector.dat')
for file in files:  # files: list of image paths, defined elsewhere
    img = cv2.imread(file, cv2.IMREAD_COLOR)
    detector(img, 1)
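
As a side note, if your dlib build supports the batched overload (the bundled cnn_face_detector.py example shows it), a list of same-sized images can be passed in one call. A sketch, assuming the same files list and weights path:

import cv2
import dlib

detector = dlib.cnn_face_detection_model_v1('weights/mmod_human_face_detector.dat')

# dlib expects RGB while cv2.imread returns BGR; converting affects
# detection accuracy, not CPU usage. Batch mode requires all images
# to have the same dimensions.
imgs = [cv2.cvtColor(cv2.imread(f, cv2.IMREAD_COLOR), cv2.COLOR_BGR2RGB)
        for f in files]
all_dets = detector(imgs, 1, batch_size=128)

Batching amortizes per-call overhead, but it does not change the polling behavior discussed below.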

Environment: Ubuntu 16.04, Python 3.6, dlib 19.15.0, GeForce GTX 1060 6GB

davisking commented 6 years ago

That’s how it works. It’s not a bug.
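
For context: the clock_gettime flood above is the signature of a spin wait. While the GPU runs, the host thread polls for completion in a tight loop instead of sleeping, which pegs one core but minimizes latency. A minimal, hypothetical illustration of the two waiting styles (not dlib's or CUDA's actual code):

import threading
import time

done = threading.Event()

def spin_wait(timeout=10.0):
    # Tight polling loop: each time.monotonic() call is conceptually a
    # clock_gettime(CLOCK_MONOTONIC, ...) call, the same pattern strace
    # showed above. One core stays at 100% until the event fires.
    deadline = time.monotonic() + timeout
    while not done.is_set() and time.monotonic() < deadline:
        pass

def blocking_wait(timeout=10.0):
    # The kernel parks the thread: near-zero CPU while waiting, at the
    # cost of a context switch when it wakes.
    done.wait(timeout)

The trade-off is latency: a blocking wait pays a context switch on every wake-up, which is why spinning is a common default for GPU synchronization.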