Closed — pliablepixels closed this issue 5 years ago
Closing - I think this is more appropriate for the dlib repo.
@pliablepixels did you find your answer?
I did. a) It was a goofy mistake on my side (https://github.com/davisking/dlib/issues/1805). b) As a side note, I also realized that just compiling dlib with GPU support doesn't mean it will be used - the GPU drivers also need to be correct.
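For reference, a quick sanity check from Python (a minimal sketch, assuming the dlib Python bindings are installed) is to ask dlib itself whether it was built with CUDA and whether it can see a device:

```python
# Minimal check that dlib was actually built with CUDA and can see a GPU.
# Compiling with GPU support alone is not enough if the driver/CUDA/cuDNN
# stack is broken.
import dlib

print("dlib built with CUDA:", dlib.DLIB_USE_CUDA)  # False => CPU-only build
if dlib.DLIB_USE_CUDA:
    # Number of CUDA devices dlib can see; 0 usually points at a driver problem.
    print("CUDA devices visible to dlib:", dlib.cuda.get_num_devices())
```

If `DLIB_USE_CUDA` is `False`, the build is CPU-only regardless of what CUDA/cuDNN versions are installed on the machine.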
@pliablepixels thanks for the reply. I think I'm hitting the same problem (version mismatches between components). I'm using a GeForce 920M (4GB) with driver 418.67, CUDA 10.1, and cuDNN 7.6.1 on Ubuntu 18.04. Can you say anything about this stack?
My context: I'm trying to improve the FPS of face recognition on an RTSP video stream.
I can say a lot, given that I went through the complete process of getting everything working :-D I've created a gist to walk you through it. I'm not sure what your issue is, though - whether dlib is not using the GPU, or whether the GPU is not configured correctly. Either way, if you need more help, feel free to continue commenting on the gist: https://gist.github.com/pliablepixels/f0f86d8f8a8d2ddcbe9b9d4e25088af4
@pliablepixels thank you so much. I'll try it and report back whether it worked.
No problem. FYI, after all these changes, detecting an 800x600 face takes around 240 ms on average in CNN mode (first load time not included).
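In case anyone wants to reproduce a number like that, here is a rough timing sketch using the face_recognition API (the image path is just a placeholder, and the first call is excluded so that model load / CUDA initialization isn't counted):

```python
# Rough timing sketch for CNN-mode face detection with face_recognition.
# "image.jpg" is a placeholder path, not a file from this thread.
import time
import face_recognition

image = face_recognition.load_image_file("image.jpg")  # e.g. an 800x600 frame

face_recognition.face_locations(image, model="cnn")    # warm-up call, not timed

runs = 10
start = time.time()
for _ in range(runs):
    face_recognition.face_locations(image, model="cnn")
print("average detection time: %.0f ms" % ((time.time() - start) / runs * 1000))
```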
@murilolopes For RTSP, use this approach.
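For anyone reading this later, a common pattern for this (not necessarily the exact approach referenced above; the RTSP URL below is a placeholder) is to read frames with OpenCV, downscale them, and only run the expensive CNN detector on every few frames:

```python
# Hedged sketch of RTSP + face_recognition: grab frames with OpenCV,
# downscale them, and run the CNN detector only on every 5th frame to keep
# the effective FPS up. The RTSP URL is a placeholder.
import cv2
import face_recognition

cap = cv2.VideoCapture("rtsp://user:pass@camera-ip:554/stream")  # placeholder URL
frame_no = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame_no += 1
    if frame_no % 5:          # skip 4 out of every 5 frames
        continue
    small = cv2.resize(frame, (0, 0), fx=0.5, fy=0.5)   # downscale -> faster detection
    rgb = cv2.cvtColor(small, cv2.COLOR_BGR2RGB)        # OpenCV is BGR, dlib expects RGB
    boxes = face_recognition.face_locations(rgb, model="cnn")
    print(frame_no, boxes)

cap.release()
```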
Description
I have an NVIDIA 1050 Ti GPU installed on my machine.
Drivers are correct:
I have DLIB compiled with GPU:
However, when I run face_recognition examples, this is what I see:
A typical detection on my machine (Xeon 3.1 GHz, 32 GB RAM) takes 3 seconds. The output of nvidia-smi seems to show it's using "C" (CPU?) and not the GPU?
Please let me know if you need any more info.