Open rain2008204 opened 6 years ago
@rain2008204 I have the same issue. @davidsandberg Please help in solving this. As the number of classes increases, the face recognition is not that robust and often gives false positives. Any idea how to go about solving this?
Softmax can be used for classification training; it takes longer to train, but it is faster at inference.
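To illustrate the inference-speed point above: once a softmax head is trained on top of the embeddings, classifying a face is just one matrix multiply plus a normalization. This is a minimal NumPy sketch with random stand-in weights (the embedding size, class count, and weights here are hypothetical, not from the facenet pretrained model):

```python
import numpy as np

def softmax(z):
    # subtract the row max for numerical stability before exponentiating
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
n_classes, dim = 10, 128          # assumed sizes for illustration
W = rng.normal(size=(dim, n_classes))   # stand-in for trained weights
b = np.zeros(n_classes)

embedding = rng.normal(size=(1, dim))   # stand-in for a facenet embedding
probs = softmax(embedding @ W + b)      # inference: one matmul + softmax
pred = int(probs.argmax())              # predicted identity index
```

The training cost comes from fitting `W` and `b` over all classes; inference stays cheap even as the class count grows, since it is a single dense layer.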
How about using an SVM classifier, e.g. LinearSVC()?
Dear All, Maybe you can try dbscan?
https://github.com/davidsandberg/facenet/blob/master/contributed/cluster.py
Ideally, 20k people should result in 20k clusters.
But the reality here is far from ideal, unfortunately.
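The DBSCAN idea can be sketched with scikit-learn on synthetic embeddings (the `eps` value below is tuned only to this toy data, not a recommendation for real facenet embeddings, where you would have to calibrate it against actual pairwise distances):

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
# two tight synthetic "identity" clusters of 128-d embeddings
emb_a = rng.normal(loc=0.0, scale=0.05, size=(30, 128))
emb_b = rng.normal(loc=1.0, scale=0.05, size=(30, 128))
X = np.vstack([emb_a, emb_b])

# eps must lie between typical within-identity and between-identity
# distances; here within ~0.8 and between ~11, so 1.5 is safely in range
db = DBSCAN(eps=1.5, min_samples=5, metric="euclidean").fit(X)
labels = db.labels_
# DBSCAN marks noise points with label -1; exclude them from the count
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
```

The practical difficulty, as noted above, is exactly this `eps` calibration: with 20k real identities the within- and between-identity distance distributions overlap, so the cluster count rarely matches the identity count.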
Btw, I just discovered that cluster.py was not committed until Mon Aug 28 13:11:55 2017 +0200:

commit e5e0393ee4d496cf38cf182f5afa106bf3071d88
Author: Maarten Bloemen maarten.bloemen@student.pxl.be
Date: Mon Aug 28 13:11:55 2017 +0200
BR, JimmyYS
Hi! Does anyone know how to estimate the required amount of GPU VRAM? Say I use the pretrained VGG model and run the classifier on 1000 people (100 faces/person) — how much VRAM is recommended? Is a single RTX 2080 Ti enough?
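There is no exact answer without profiling, but a crude back-of-envelope helps (this is my own rule of thumb, not anything from the repo): inference VRAM is roughly the model weights plus some multiple for activations and framework workspace, and the embeddings themselves are tiny by comparison:

```python
def vram_estimate_gb(n_params, bytes_per_param=4, activation_factor=2.0):
    """Very rough inference VRAM estimate: weights in float32 plus an
    assumed multiple (activation_factor) for activations/workspace."""
    return n_params * bytes_per_param * (1 + activation_factor) / 1e9

# e.g. an assumed ~25M-parameter backbone (hypothetical figure):
model_gb = vram_estimate_gb(25e6)

# the classifier's training data is negligible: 1000 people x 100 faces
# of 128-d float32 embeddings
embeddings_gb = 1000 * 100 * 128 * 4 / 1e9
```

Under these assumptions the model needs well under 1 GB and the embeddings only ~0.05 GB, so an 11 GB RTX 2080 Ti should be far more than enough for inference; actual usage depends on batch size, input resolution, and the framework's memory allocator.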
Who can help me solve this? I have no idea how to recognize a large number of people.