Open annhienktuit opened 4 years ago
https://youtu.be/sz25xxF_AVE — check out this YouTube video, then add a large dataset of celebrity images to that training images folder.
I've done this work for you, have an eye here: https://github.com/alessiosavi/PyRecognizer
I currently need to analyse 1000 faces: I have computed 1000 face encodings, together with their names, and saved them in a pickle file. When I run the face comparison, the match shows wrong predictions. Is there any way to overcome this? I am currently using 3 images per celebrity. Any solution?
If you are using the face_recognition library, you can train the images with a higher `num_jitters`: it controls how many times the face is re-sampled when calculating the encoding. Higher is more accurate, but slower (e.g. 100 is 100x slower).
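As a rough sketch of that encoding-and-pickling step (assuming the `face_recognition` package is installed; the function names and image path here are illustrative, not from this thread):

```python
import pickle


def encode_face(image_path, jitters=100):
    """Compute one 128-d encoding for the first face found in an image.

    Hypothetical helper: num_jitters re-samples the face that many times
    for a more stable encoding, at the cost of proportionally slower runs.
    Requires `pip install face_recognition`.
    """
    import face_recognition

    image = face_recognition.load_image_file(image_path)
    encodings = face_recognition.face_encodings(image, num_jitters=jitters)
    return encodings[0] if encodings else None


def save_encodings(known, path):
    """Persist a {name: [encoding, ...]} mapping to a pickle file."""
    with open(path, "wb") as f:
        pickle.dump(known, f)


def load_encodings(path):
    """Load the {name: [encoding, ...]} mapping back from disk."""
    with open(path, "rb") as f:
        return pickle.load(f)
```

The expensive jittered encoding runs once, offline; at recognition time you only load the pickle and compare encodings.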
These parameters make training slower but more accurate. You can also pass `--tolerance 0.54`: lowering the tolerance makes matching stricter, but with 1000 images it runs slowly. I have done this with 13k people's faces, and a tolerance below 0.6 gave a frame rate lower than 30 fps; that's why I only trained with a high number of jitters instead.
Thanks @silexxx ! I will definitely work with those inputs.