ftrentini closed this issue 2 years ago
Depending on the GPU you use, there may be a utility available to observe GPU usage. I'm interested to know what you did to try using your GPU.
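With an NVIDIA card, for example, nvidia-smi shows per-process usage (a rough sketch, not specific to your setup):

$ nvidia-smi

If the face-recognition process never appears in the Processes table at the bottom of the output, it is not using the GPU at all, whatever the utilization percentage says.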
My dedicated server has a GTX 1060 paired with a Xeon E5-2520 and 16 GB of RAM, connected to the webserver over gigabit ethernet. My doubt comes from the fact that the gpustat command showed only about 1-2% usage.
So you're just using Python dlib without any changes? It's just a set of bindings to the dlib library, and if that library doesn't support your GPU (or wasn't built with CUDA support), you aren't going to get GPU acceleration. Try this:
$ python
Python 3.10.2 (main, Jan 17 2022, 00:00:00) [GCC 11.2.1 20211203 (Red Hat 11.2.1-7)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import dlib
>>> dlib.DLIB_USE_CUDA
False
If it doesn't return True (you can see mine reports False even though I have a GTX 980), your dlib library was built without GPU support.
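If it does return True, recent dlib builds also expose a small cuda submodule you can use to confirm a device is actually visible (a sketch, assuming a reasonably recent dlib):

>>> import dlib
>>> dlib.DLIB_USE_CUDA
True
>>> dlib.cuda.get_num_devices()  # how many CUDA devices dlib can see
1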
Hmmmm, enlightenment!
Solved! Step by step: I rebuilt the dlib Python bindings from source with CUDA support enabled, roughly as sketched below.
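A minimal sketch of the usual source build, assuming the CUDA toolkit and cuDNN are already installed (versions and paths will vary):

$ git clone https://github.com/davisking/dlib.git
$ cd dlib
$ python setup.py install
# watch the CMake output: it reports whether CUDA and cuDNN were found,
# and falls back to a CPU-only build if they weren't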
After that, I re-ran the check, and voilà:
Python 3.8.10 (default, Nov 26 2021, 20:14:08)
[GCC 9.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import dlib
>>> dlib.DLIB_USE_CUDA
True
>>>
Tested from PHP as well. Working like a charm! Now I can try to process all my 240k+ photos!! :smile:
Thanks a lot @guystreeter and @matiasdelellis
Hi, I know this is not the channel for this kind of message, but I tried to use a GPU to process externally and I really don't know if it's working. My main server is CPU-only, yet it is processing faster than the dedicated, GPU-powered external-model server I built just for face recognition. Is there a way to really know whether the model is using the GPU?
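One quick sanity check is the dlib flag shown earlier in the thread, run in whatever Python environment the external model actually uses (a sketch):

$ python -c "import dlib; print(dlib.DLIB_USE_CUDA)"  # should print True on a CUDA-enabled build

If it prints False there, the external model is falling back to the CPU.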