ageitgey / face_recognition

The world's simplest facial recognition api for Python and the command line
MIT License

How can I implement this library with a big dataset of celebrity images? #1180

Open · annhienktuit opened this issue 4 years ago

annhienktuit commented 4 years ago

silexxx commented 4 years ago

Check out this YouTube video: https://youtu.be/sz25xxF_AVE, then add your big dataset of celebrity images to the training images folder shown there.
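
For reference, a minimal sketch of building encodings from such a training images folder with the face_recognition API. The folder layout `training_images/<person_name>/<photo>.jpg` and the variable names are assumptions, not something from the video:

```python
import os
import face_recognition

known_encodings = []
known_names = []

# Assumed layout: training_images/<person_name>/<photo>.jpg
for person_name in os.listdir("training_images"):
    person_dir = os.path.join("training_images", person_name)
    if not os.path.isdir(person_dir):
        continue
    for filename in os.listdir(person_dir):
        image = face_recognition.load_image_file(os.path.join(person_dir, filename))
        encodings = face_recognition.face_encodings(image)
        if encodings:  # skip photos where no face was detected
            known_encodings.append(encodings[0])
            known_names.append(person_name)
```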

alessiosavi commented 4 years ago

I've done this work for you; have a look here: https://github.com/alessiosavi/PyRecognizer

aravindseng commented 4 years ago

I currently need to analyse 1000 faces. I have computed 1000 face encodings with their names and saved them in a pickle file. When I run the face comparison, the match shows wrong predictions. Is there any way to overcome this? I am currently using 3 images per celebrity. Any solution?
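
For illustration only, a sketch of the setup described above (the pickle keys "encodings"/"names" and the file names are assumptions), using face_distance to pick the single closest known face rather than taking the first compare_faces hit:

```python
import pickle

import face_recognition
import numpy as np

# Assumed pickle layout: {"encodings": [128-d vectors], "names": [strings]}
with open("encodings.pickle", "rb") as f:
    data = pickle.load(f)

unknown_image = face_recognition.load_image_file("query.jpg")

for unknown_encoding in face_recognition.face_encodings(unknown_image):
    # Distance to every known encoding; smaller means more similar
    distances = face_recognition.face_distance(data["encodings"], unknown_encoding)
    best = int(np.argmin(distances))
    if distances[best] <= 0.6:  # 0.6 is the library's default tolerance
        print(f"Best match: {data['names'][best]} (distance {distances[best]:.3f})")
    else:
        print("No match within tolerance")
```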

silexxx commented 4 years ago

If you are using the face_recognition library, you can tune the encoding step. From the documentation:

num_jitters – How many times to re-sample the face when calculating encoding. Higher is more accurate, but slower (i.e. 100 is 100x slower).

model – Optional - which model to use. “large” (default) or “small” which only returns 5 points but is faster.

These parameters make the encoding slower but more accurate. You can also pass --tolerance 0.54, which makes matching stricter, but with 1000 images it runs very slowly. I have done this with 13k people's faces, and a tolerance below 0.6 drops the frame rate under 30 fps; that's why I only used a high number of jitters while building the encodings. See the sketch below.
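
To make that concrete, a hedged sketch of where those parameters go; the file names and the values num_jitters=100 / tolerance=0.54 are just the examples mentioned above:

```python
import face_recognition

# Build a more accurate (but much slower) encoding for a known face.
# The model keyword on face_encodings needs a recent face_recognition release.
known_image = face_recognition.load_image_file("celebrity_01.jpg")
known_encoding = face_recognition.face_encodings(
    known_image, num_jitters=100, model="large"
)[0]

# Encode a face from the frame / photo to be checked.
unknown_image = face_recognition.load_image_file("frame.jpg")
unknown_encoding = face_recognition.face_encodings(unknown_image)[0]

# A tolerance below the default 0.6 makes matching stricter.
matches = face_recognition.compare_faces(
    [known_encoding], unknown_encoding, tolerance=0.54
)
print(matches)  # [True] if the faces match within tolerance
```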

aravindseng commented 4 years ago

Thanks @silexxx ! I will definitely work with those inputs.