ageitgey / face_recognition

The world's simplest facial recognition api for Python and the command line
MIT License

How to set face_encodings to use CPU and RAM only? #865

Open michaeldengxyz opened 5 years ago

michaeldengxyz commented 5 years ago

Description

When I ran face_locations with model='hog', there was no problem. But when I ran face_encodings, I got the error below; it seems face_encodings tries to allocate memory on the GPU. How do I set it to use the CPU and RAM only?

What I Did

faces_locations = face_recognition.face_locations(rgb_small_frame, number_of_times_to_upsample=1, model='hog')
face_encodings = face_recognition.face_encodings(rgb_small_frame, faces_locations)

Traceback (most recent call last):
  File "h:/WE4/_git/Photo-Manager.py", line 979, in FacesCropFromImage
    face_encodings = face_recognition.face_encodings(curframe, faces)
  File "C:\ProgramData\Anaconda3\envs\tensorflow_gpu\lib\site-packages\face_recognition\api.py", line 210, in face_encodings
    return [np.array(face_encoder.compute_face_descriptor(face_image, raw_landmark_set, num_jitters)) for raw_landmark_set in raw_landmarks]
  File "C:\ProgramData\Anaconda3\envs\tensorflow_gpu\lib\site-packages\face_recognition\api.py", line 210, in <listcomp>
    return [np.array(face_encoder.compute_face_descriptor(face_image, raw_landmark_set, num_jitters)) for raw_landmark_set in raw_landmarks]
RuntimeError: Error while calling cudaMalloc(&data, new_size*sizeof(float)) in file I:\dlib-19.17\dlib\cuda\gpu_data.cpp:218. code: 2, reason: out of memory
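Whether face_encodings takes the GPU path at all depends on how the installed dlib was built. A minimal check sketch (the helper name dlib_uses_cuda is mine; dlib.DLIB_USE_CUDA is the flag dlib itself exposes):

```python
import importlib.util

def dlib_uses_cuda():
    """Return the installed dlib's CUDA flag, or None if dlib is not installed."""
    if importlib.util.find_spec("dlib") is None:
        return None
    import dlib
    # True means this dlib was compiled with CUDA support and will try the GPU.
    return bool(dlib.DLIB_USE_CUDA)

print(dlib_uses_cuda())
```

If this prints True, the cudaMalloc failure above is expected once the 2 GB GPU fills up.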

michaeldengxyz commented 5 years ago

My RAM, CPU, and GPU usage when I got that error:
... RAM usage: 56.8% (9279M / 16349M); CPU usage: 49.8% ...
GPU#0: total memory 2048M, used 1978M (96.58%), free 69M

ageitgey commented 5 years ago

You could compile/install dlib without CUDA support; then it would never use your GPU. I'm not sure how to disable it on-the-fly on Windows.
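A rebuild along those lines might look like the following, assuming the setup.py flag syntax that dlib 19.17 (the version in the traceback) supports; newer dlib releases may expect the CMake option -DDLIB_USE_CUDA=0 instead:

```shell
# Rebuild dlib from source with CUDA disabled (flag syntax per dlib 19.x setup.py).
pip uninstall -y dlib
git clone https://github.com/davisking/dlib.git
cd dlib
python setup.py install --no DLIB_USE_CUDA
```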

michaeldengxyz commented 5 years ago

> You could compile/install dlib without CUDA support. Then it would never use your GPU. I'm not sure one Windows how to disable it on-the-fly.

I'll try later, thanks a lot!
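One possible on-the-fly alternative, which I haven't verified on Windows with this build, is hiding all GPUs from the CUDA runtime before dlib is imported; some CUDA-enabled dlib builds then fall back to the CPU:

```python
import os

# Must be set before dlib/face_recognition is imported, because the CUDA
# runtime reads CUDA_VISIBLE_DEVICES when it initializes.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

# import face_recognition  # dlib would see no CUDA devices from here on
print(os.environ["CUDA_VISIBLE_DEVICES"])
```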