Open · bigsea00001 opened this issue 6 years ago
Check if dlib is using GPU for acceleration.
The other parts, for example training, do use the GPU, so they are fast.
This seems to be the same problem as this issue: https://github.com/shaoanlu/faceswap-GAN/issues/1
I would like to use the GAN more than other faceswap programs, but since I am not a professional programmer, I did a lot of Googling and still could not fix it.
I don't know how to check whether dlib is using the GPU for acceleration, or what to do if it is not.
Can you help me?
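One quick way to check is to look at dlib's build flag. A minimal sketch, assuming a dlib version recent enough to expose `DLIB_USE_CUDA` (the helper name is mine):

```python
# Sketch: report whether the installed dlib build was compiled with CUDA.
# Assumes a dlib version that exposes the DLIB_USE_CUDA attribute.
def dlib_uses_cuda():
    try:
        import dlib
    except ImportError:
        return None  # dlib is not installed at all
    # Older builds may lack the attribute; treat that as "no CUDA".
    return bool(getattr(dlib, "DLIB_USE_CUDA", False))

if __name__ == "__main__":
    print("dlib CUDA build:", dlib_uses_cuda())
```

If this prints `False`, dlib was built without CUDA and the CNN detector will run on the CPU only.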
In `def process_video(input_img):`, remove the `model="cnn"` argument, i.e., change
faces = face_recognition.face_locations(image, model="cnn")
to
faces = face_recognition.face_locations(image)
The API will then fall back to its default HOG-based detector for face detection. Check whether this gives reasonable speed; if not, the bottleneck may be writing the video clip rather than face detection.
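The change above can be sketched as a small helper (the function name is mine; it assumes the `face_recognition` package, imported lazily so the sketch degrades gracefully if the package is missing):

```python
def detect_faces(image, model="hog"):
    """Return face bounding boxes for an RGB image (numpy array).

    face_recognition's default detector is the CPU-based HOG model;
    model="cnn" only pays off when dlib was built with CUDA support,
    otherwise it can be orders of magnitude slower.
    """
    try:
        import face_recognition
    except ImportError:
        return []  # package not installed; nothing to detect with
    return face_recognition.face_locations(image, model=model)
```

Calling `detect_faces(frame)` inside `process_video` reproduces the suggested default-detector behavior; `detect_faces(frame, model="cnn")` restores the CNN path once GPU support is confirmed.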
Image : https://imgur.com/Y2EOddR
I'm not sure whether that speed is OK, since it depends on your hardware and the input video resolution. But a roughly 100x speed difference between the default HOG detector and a non-GPU-accelerated CNN is plausible, so it seems face_recognition is not using your GPU.
FYI, I got about 5 iter/s using CNN face detection at 600x600 input resolution.
Many thanks! How do I get face_recognition to use my GPU?
Take a look at this issue in face_recognition. I haven't encountered this problem myself, so I can't be of much help.
Thank you for your kind reply. Once this issue is resolved, I will report back with the outcome.
+1 to this. By removing the [model="cnn"] part, the speed is okay now (about 5 min for a 10 s input video). (I'm using a GTX 770 + cudnn64_6.dll.)
However, writing the mp4 still seems too slow.
Below, (1) is my device information and (2) is a screenshot of the speed while making the video. I would appreciate it if you could tell me what I need to do.
(1)https://imgur.com/P0mMDlC (2)https://imgur.com/tqd4lyP
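To tell whether detection or video writing is the slow stage, you could time each one separately. A minimal sketch (the `timed` helper and the usage names in the comment are mine):

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn(*args, **kwargs) and return (result, elapsed_seconds)."""
    t0 = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - t0

# Usage idea inside the frame loop (names hypothetical):
#   faces, t_detect = timed(face_recognition.face_locations, frame)
#   _, t_write = timed(writer.write, frame)
# Comparing t_detect to t_write per frame shows which stage dominates.
```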