takuya-takeuchi / FaceRecognitionDotNet


FaceEncoding changes between machines #155

Closed: ADD-eNavarro closed this issue 3 years ago

ADD-eNavarro commented 3 years ago

In the test phase of our system, we have already saved the FaceEncodings of known faces, and we then create new FaceEncodings on different devices (other computers, but also phones and tablets), using the same picture, to check that each face is recognized correctly (the distance between the saved and the new encoding should be 0). But we find that, inside the double[] of both encodings, the first 7 decimal digits of each value match and the values then diverge (see the attached screenshot for an example).

We get the encoding using FaceEncoding.GetRawEncoding(), roughly as in the sketch below. We are using the MKL package. Any idea why this is happening? Do you think it will have a noticeable effect on face recognition results? Also, and this I will test myself as soon as possible, do you think the non-MKL or CUDA packages will make the doubles change too?
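For reference, a minimal sketch of how we compare a saved encoding against a fresh one: a plain Euclidean distance over GetRawEncoding(). The class and parameter names here are placeholders, not the exact code of our system:

```csharp
using System;
using System.Linq;
using FaceRecognitionDotNet;

static class EncodingCheck
{
    // "savedRaw" is the double[] we persisted for a known face;
    // "fresh" is the FaceEncoding computed on another device from the same picture.
    public static double Distance(double[] savedRaw, FaceEncoding fresh)
    {
        var freshRaw = fresh.GetRawEncoding();
        var sum = savedRaw.Zip(freshRaw, (a, b) => (a - b) * (a - b)).Sum();
        return Math.Sqrt(sum); // we expected exactly 0 for the same picture
    }
}
```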

takuya-takeuchi commented 3 years ago

I'm not sure which OS and machine you are using, but your result is not strange. The Linux and OSX builds of FRDN depend on BLAS, while the Windows build does not, and this kind of difference in floating-point results also occurs elsewhere in the deep learning world.

BTW, I have not tested on two machines with the same Windows OS and CPU. As a test, I extracted the face encoding from the same image using different packages on different machines, roughly as in the sketch below; the first 7 values of each encoding follow.
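A minimal sketch of such an extraction, assuming the usual FaceRecognition.Create / FaceLocations / FaceEncodings entry points of FRDN; the models directory and image path are placeholders:

```csharp
using System;
using System.Linq;
using FaceRecognitionDotNet;

// "models" must contain the dlib model files used by FRDN (path is a placeholder).
using var fr = FaceRecognition.Create("models");
using var image = FaceRecognition.LoadImageFile("face.jpg");

foreach (var location in fr.FaceLocations(image))
{
    // Print the detected rectangle in the same l/t/r/b form as below.
    Console.WriteLine($"l: {location.Left}, t: {location.Top}, r: {location.Right}, b: {location.Bottom}");

    using var encoding = fr.FaceEncodings(image, new[] { location }).First();
    Console.WriteLine(string.Join(", ", encoding.GetRawEncoding().Take(7)));
}
```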

Intel CPU on OSX

CPU package

l: 190, t: 32, r: 252, b: 94
-0.0858780965209007, 0.108834989368916, 0.0365723818540573, -0.0589280351996422, 0.0132484585046768, -0.00542666018009186, -0.0654468163847923

MKL package

l: 190, t: 32, r: 252, b: 94
-0.0858780890703201, 0.108834981918335, 0.0365723744034767, -0.0589280650019646, 0.0132484287023544, -0.00542664900422096, -0.0654467791318893

Intel CPU on Windows

CPU package

l: 190, t: 32, r: 252, b: 94
-0.0858781188726425, 0.108835116028786, 0.0365724340081215, -0.0589281246066093, 0.0132484100759029, -0.00542659685015678, -0.065446712076664

MKL package

l: 190, t: 32, r: 252, b: 94
-0.0858781188726425, 0.108835056424141, 0.0365723744034767, -0.0589279383420944, 0.0132484287023544, -0.00542665924876928, -0.0654468238353729
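For a sense of scale, a quick sketch that computes the Euclidean distance between the OSX CPU and Windows MKL values above. Only the 7 values quoted here are used (the full encoding has 128 dimensions), so this is purely illustrative:

```csharp
using System;

// First 7 values quoted above; the real encodings have 128 dimensions.
double[] osxCpu =
{
    -0.0858780965209007, 0.108834989368916, 0.0365723818540573,
    -0.0589280351996422, 0.0132484585046768, -0.00542666018009186,
    -0.0654468163847923,
};
double[] winMkl =
{
    -0.0858781188726425, 0.108835056424141, 0.0365723744034767,
    -0.0589279383420944, 0.0132484287023544, -0.00542665924876928,
    -0.0654468238353729,
};

var sum = 0.0;
for (var i = 0; i < osxCpu.Length; i++)
{
    var d = osxCpu[i] - winMkl[i];
    sum += d * d;
}

// Prints a distance on the order of 1e-7, far below the usual 0.6 match tolerance.
Console.WriteLine(Math.Sqrt(sum));
```

So while the raw doubles differ between packages and machines, the differences here are many orders of magnitude smaller than the distances used to separate different people.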

ADD-eNavarro commented 3 years ago

As usual, a fast and very satisfying answer. Thank you very much, Mr. @takuya-takeuchi.