deepinsight / insightface

State-of-the-art 2D and 3D Face Analysis Project
https://insightface.ai

Something wrong, and I got 100% accuracy 🥵 #2287

Open CMakey opened 1 year ago

CMakey commented 1 year ago

As a beginner in CV, I trained this model on a Tesla P4 server.

For training, I used the UMD dataset and reset the parameters after copying the configs.

```python
# config for arcface_torch; edict comes from the easydict package
from easydict import EasyDict as edict

config = edict()
config.margin_list = (1.0, 0.5, 0.0)    # (m1, m2, m3) combined margin; this setting is ArcFace
config.network = "r100"
config.resume = False
config.output = None
config.embedding_size = 512
config.sample_rate = 1.0                # 1.0 = no Partial FC class sampling
config.fp16 = True
config.momentum = 0.9
config.weight_decay = 5e-4
config.batch_size = 6                   # batch size per GPU
config.lr = 0.1
config.verbose = 2000                   # run the val_targets verification every 2000 steps
config.dali = False
config.rec = "../_datasets_/faces_umd"
config.num_classes = 10575              # must match the identity count in the rec file
config.num_image = 494414               # must match the image count in the rec file
config.num_epoch = 5
config.warmup_epoch = 0
config.val_targets = ['lfw', 'cfp_fp', "agedb_30"]
```
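
One thing worth double-checking, since `num_classes` / `num_image` are often copied from another dataset's config: whether they match what is actually inside `faces_umd`. A minimal sketch of such a check, assuming the standard insightface MXNet RecordIO layout in which record 0 stores the identity index range (this layout is an assumption, not code from the repo):

```python
import os
from mxnet import recordio

rec_dir = "../_datasets_/faces_umd"  # config.rec from above
imgrec = recordio.MXIndexedRecordIO(
    os.path.join(rec_dir, "train.idx"),
    os.path.join(rec_dir, "train.rec"), "r")

# record 0 is a header whose label holds [identity_start, identity_end]
header, _ = recordio.unpack(imgrec.read_idx(0))
if header.flag > 0:
    start, end = int(header.label[0]), int(header.label[1])
    print("num_image   =", start - 1)    # image records occupy ids 1 .. start-1
    print("num_classes =", end - start)  # identity records occupy ids start .. end-1
```

If those printed values disagree with the config, the copied numbers belong to a different dataset.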

The training log shows the result (training.log), and the model is here: model.

When I tried the model on a few pics selected from IJB-C, the result came out as 100%.


```
Time: 0.00 s.
Time: 0.00 s.
files: 300
batch 0
batch 1
batch 2
Time: 9.90 s.
Feature Shape: (300 , 1024) .
(300, 512) (300,)
Finish Calculating 0 template features.
Time: 0.01 s.
Finish 0/1 pairs.
Time: 0.00 s.
+-----------+--------+--------+--------+--------+--------+--------+
|  Methods  | 1e-06  | 1e-05  | 0.0001 | 0.001  |  0.01  |  0.1   |
+-----------+--------+--------+--------+--------+--------+--------+
| ijbc-IJBC | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
+-----------+--------+--------+--------+--------+--------+--------+
```

I couldn't figure it out, and I hope someone has a solution.
CMakey commented 1 year ago

What's more, when I run `python inference.py --weight work_dirs/face_umd_r100/model.pt --network r100`, the output comes out like this; there are only about 200 numbers in all, but I don't know what to do.

```
  -0.14197479 -0.90856093  0.2612555   2.5996816  -0.8645787   1.0984787
  -1.5481728   1.1185685   1.1716927  -0.5919134  -0.86590195  0.4793761
   0.20306966  0.31248182 -0.7318979   0.10985775 -0.69324315 -1.2252758
   0.9595894   2.4714372   0.9033031   1.65573     1.5275675   1.9243096
   0.30653664 -0.3621285  -0.22210565 -0.9324069  -3.2201917   0.4331724
   0.21072745  0.7518698   0.40516305 -0.63688517 -0.50916415  0.73849624
   0.39027804  0.34557724  1.4237952   0.52682006 -0.35319164  1.1468838
   1.7897052  -0.08494984 -3.0109212   1.6659174  -1.9667882  -1.2440423
   2.1366186   2.1304927  -0.37914786 -0.7269743   0.8108647   0.9784584
  -1.6757375   1.2260991  -2.3573203   1.7145627   1.0278714  -1.3689977
  -0.69652265 -1.673241   -0.23464157  0.827121    2.085239   -0.13495259
   2.4772122  -1.829716    1.1441706  -1.096763    1.6096206   0.7074399
   0.3498694   0.37551004  0.29052967 -1.5270598   0.6691264  -0.9401472
  -1.4271096   1.0869007  -1.4407881   0.8018797  -1.486573   -0.14876139
   0.39238277  0.5260103   1.1262397   1.4627178  -0.46582985  0.28019136
  -1.1019486   1.5502445  -3.7200494   0.13907395 -0.77018434  1.5063859
  -0.42647538 -2.7784085  -0.51739246  0.1862042   0.79934347  0.74231964
   0.12219442  0.45059732  0.862764   -1.5644227   1.7897822   0.7287775
   1.2232933  -0.46182632 -2.4194262   1.4597034   2.119485    0.4852753
  -0.24113019  0.80735     0.15052532  0.63328546  1.7220626  -2.3073342
   0.51064366 -0.97686666 -0.20707193 -1.166993    0.6556213   0.45225808
   2.4424076   0.53898114  0.6369736   0.09511925  0.9471057  -1.1503799
   0.08223129 -0.5421476  -0.0612626  -1.1479584  -2.5161614   1.2851589
  -1.97923     0.11969204 -0.6341834  -0.3547518   0.8522022  -0.6718146
   2.1875858  -0.2715869   0.68040913 -0.45416573  0.17012958 -1.8134416
  -0.5063244   0.25494152 -0.29034883 -0.4156903  -2.8594334   0.10826248
   0.36027417 -1.2161708  -1.5655534   1.1249104  -0.13304886  0.14267042
   1.188729    0.99348867  1.0221468  -0.57966906 -1.9867332   1.4589988
   1.7214968   0.8131395  -0.17921506 -0.21746238  1.1440798  -0.5321635
  -1.226887   -1.1130831   0.09922308 -0.58225113 -1.8647765   0.58119416
  -0.12439682  0.06047543  0.61621535  1.3933288   0.97782165  1.6939999
  -2.0418587  -0.04077145 -3.745717   -0.9432432  -0.36207813  0.72862476
  -0.78732896  1.5361986   1.8630766   0.4465176   0.19769527  0.63913816
   1.3576422  -0.65597624  2.2250514   1.30703    -0.44994417 -2.2717416
  -0.5714991   1.8127705  -0.86886215  0.03884459  0.20872341 -1.6363363
   0.77186567  1.2802613   3.0081882  -0.9313274   0.9855726   0.47933003
   1.1960621  -1.5032662   1.4641418  -0.63546544 -1.8390664   0.8940129
  -0.37845048 -1.659115    0.6662984  -0.05085506  1.3265948   0.0377506
   0.6227851   1.2222685  -0.59811807 -1.5168611   1.4611329  -3.4309669
   1.7449596  -1.1654227   1.1313052   2.1832542  -0.4607211  -0.47714052
  -1.187059    1.3303441   1.1067159  -1.002934   -0.8986804  -0.10682251
  -1.192637   -0.36964417 -1.0398211   0.7834925   0.24426402 -0.10491683
   0.15511002 -0.705989    0.792656    1.9362661  -1.7136568  -0.36898094
  -2.119776    1.5569715   1.5491294   1.5076641   0.97855747 -1.9341758
  -2.205656    1.9023049   1.3639154   1.7040371   1.1560448   0.19118619
   1.2215725  -1.3706273  -0.3444504   0.70214367  1.3320069  -2.6555367
  -1.636137    1.238437   -3.0184712  -0.38138422 -0.5404247  -0.94531363
   0.04947946  1.5353955  -0.9148663  -0.27081856  0.7918925  -0.71484184
  -0.37999013  1.1978077   0.29071787 -0.36307222 -0.5188514   1.2711834
  -1.516248    0.64300203  0.9850636  -0.23884752  1.305709   -1.0388064
  -0.4388711   1.9157523  -1.6117725  -1.0291632  -1.7321966  -3.1830566
  -1.0535043   1.1002122  -1.5115194  -2.0747805  -1.3357145  -0.1989307
   2.547138   -0.19410777  0.2489086   1.4534051  -1.3778976  -1.3895962
  -3.3970342   0.12530693  1.7656265  -2.2455158  -1.4245323  -0.4269297
   1.9038008   1.0621073  -0.63229996 -0.20192151  0.17258583  1.219417
  -0.1149663   0.6173275   1.8875296  -0.56104684 -1.4668345   0.6517423
   0.7670969   1.3353549   0.41950515  1.3349037   1.3841734  -0.9001517
   0.0048165  -0.59564006 -0.9405832   0.76407194 -0.01644389 -1.0778792
  -0.21250142  0.1299661   0.8748573   1.5036438   2.6045835  -1.4992473
  -1.7407783  -1.4742312  -1.0052722   0.37579325 -0.8352497   0.10070888
  -0.4795149   0.01405213 -0.2968706   1.1805524   2.5484333   3.6290069
   3.5270672  -0.7222721  -1.6135048   1.5211242  -1.6163529   0.84502256
  -0.24393396 -2.5434554  -2.0756862   3.1062286   1.4804682  -3.4912636
   1.2259612  -0.5484494   0.68334603  0.35478243  0.01758031 -0.67679936
  -1.4135786   0.79385054 -2.0789833   1.4616469  -0.5817304  -0.3193594
   0.59265834 -1.8046415  -0.1574746  -0.44488838  1.5178074  -2.6413941
  -0.7170679  -0.7622535   0.47194666 -0.38689828  0.71537524  0.46894506
  -0.9253089  -1.8098103   0.15705091  0.6806365  -0.12298296 -0.8506894
  -1.3320992  -0.09465965  0.8546478  -2.0379236   0.23210374  1.2864611
  -1.115023    1.0626028  -0.8473559  -0.49522603  0.20396842 -1.7851207
   0.07553623  1.7501174   0.07093978  1.2681198   0.06610481  2.231619
   1.926362    1.7156997  -1.2103912  -3.2820034   0.33256155  0.58738434
   0.3856504   0.5730925  -1.6300728  -1.0513625   0.46999264  0.0592112
  -0.14507799 -0.04666986  1.6740463  -0.625391   -0.04774209  0.68075037
  -1.716437    2.0041106   0.35566595  0.39742932  3.878968    1.9351032
  -2.253918    1.6262021   0.38719636  1.764592   -0.99488777 -0.8639892
  -0.85770047  0.7612162  -1.1476153   0.4605851   0.79147536 -1.4109249
  -0.6360634   2.291107    0.26888803 -0.52940243  0.7351876   2.5535805
   0.03425223 -0.33321762 -0.87688434  0.57761836 -1.0517218  -0.5315786
   0.4163581   1.9749796   0.39105406  2.7493799   1.1646098  -0.1528221
  -0.2993352  -1.1022804   1.160775    0.22939645 -0.717629   -0.5306618
   1.3477471  -0.06745018 -0.9459232   0.61537963 -0.68127924  1.3017894
  -1.2358617  -0.58047736  2.354986   -3.6275318   0.50789833  0.5032583
  -0.86339283 -0.9664472  -1.6059818  -0.2954751  -0.06713548  0.45752537
   0.4217528  -1.0154542 ]]
```
HHCorp commented 1 year ago

> As a beginner in CV, I trained this model on a Tesla P4 server. ... When I tried the model on a few pics selected from IJB-C, the result came out as 100%. ... I couldn't figure it out, and I hope someone has a solution.

It seems there are only a few positive pairs in this TPR@FPR test; maybe you should include more images to make the test more reasonable.
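
To make that concrete: the TPR@FPR numbers are read off an ROC curve built from the pair similarity scores, so a stable value at FPR = 1e-6 needs on the order of a million impostor pairs; with a handful of pairs, every column collapses to the same degenerate point. A minimal sketch of the readout, using scikit-learn rather than the repo's own IJB eval code (`labels` and `scores` are toy placeholders):

```python
import numpy as np
from sklearn.metrics import roc_curve

# labels: 1 = same identity (genuine), 0 = impostor; scores: cosine similarities
labels = np.array([1, 0, 1, 0, 0, 1])
scores = np.array([0.72, 0.15, 0.64, 0.31, 0.08, 0.55])

fpr, tpr, _ = roc_curve(labels, scores)
for target in [1e-6, 1e-5, 1e-4, 1e-3, 1e-2, 1e-1]:
    # TPR at the last ROC point whose FPR does not exceed the target;
    # with only a few impostor pairs, all small targets hit the same point
    i = np.searchsorted(fpr, target, side="right") - 1
    print(f"TPR@FPR={target:g}: {tpr[i]:.4f}")
```

With the toy arrays above, all six columns print the same value, which is exactly the failure mode in the table.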

HHCorp commented 1 year ago

> What's more, when I run `python inference.py --weight work_dirs/face_umd_r100/model.pt --network r100`, the output comes out like this; there are only about 200 numbers in all, but I don't know what to do.
>
> ...

You actually pasted 506 numbers here... the feature you pasted is incomplete, I believe; it is missing one line with 6 numbers (the full embedding should have 512).
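
Rather than counting printed numbers, it is more reliable to check the array's shape, and NumPy can be told never to truncate its printout. A minimal sketch (`feat` is a stand-in for whatever array `inference.py` produces, not a variable from the repo):

```python
import sys
import numpy as np

np.set_printoptions(threshold=sys.maxsize)  # always print every element, no "..."

feat = np.random.randn(1, 512)  # stand-in for the embedding from inference.py
print(feat.shape)               # should be (1, 512) for embedding_size = 512
print(feat.size)                # total count of values: 512
```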

CMakey commented 1 year ago

> You actually pasted 506 numbers here... the feature you pasted is incomplete, I believe; it is missing one line with 6 numbers.

Thanks for your reply!! I ran the command again and got 512 numbers this time, but they seem different from the ones above.

```
   1.75900340e+00 -7.60324895e-02  1.34765577e+00 -1.08499575e+00
   1.94557101e-01 -1.13492444e-01  2.86602676e-01  4.03746754e-01
   6.89632058e-01  9.94994581e-01 -5.73002279e-01  5.39066076e-01
  -1.46311295e+00  6.22060895e-01  1.76022995e+00  6.21085107e-01
  -1.23920536e+00 -2.72208214e-01  6.30875528e-01  6.41248047e-01
  -7.88332283e-01 -5.16766965e-01 -4.29450989e-01 -9.68298197e-01
  -7.77071893e-01  3.26603746e+00  9.73035157e-01  2.04707098e+00
   9.20305550e-01  1.13213634e+00 -8.85198236e-01 -5.76769710e-01
  -9.21296328e-02 -1.16195357e+00 -1.82289100e+00  7.77973354e-01
   1.02067661e+00  5.79175726e-02  5.16451836e-01  1.39397606e-01
  -3.27563167e-01  1.46796083e+00  8.36651802e-01  3.82175475e-01
   1.89115274e+00  7.36460805e-01 -8.06843400e-01  1.01860869e+00
   2.18572950e+00  6.27861917e-01 -4.40298128e+00  2.01113749e+00
  -9.79798198e-01 -9.63993311e-01  5.16322136e-01  5.24655759e-01
   8.26422870e-01 -9.37060893e-01  3.28012466e-01 -6.67201459e-01
  -1.32433999e+00  9.56477880e-01 -7.26107121e-01  1.73469079e+00
  -4.42342311e-01 -5.79076588e-01 -3.35034728e-01 -2.17912149e+00
  -1.78229666e+00  2.15715027e+00  1.28357923e+00 -1.11447978e+00
   1.89849412e+00 -2.23427010e+00  2.42171645e-01  6.20192707e-01
   7.56160140e-01  7.62834430e-01 -1.24876583e+00 -3.52529325e-02
  -8.03534448e-01 -1.00405550e+00  1.28899848e+00 -7.55189538e-01
  -8.65692139e-01  5.43601930e-01 -1.39502323e+00  3.46995294e-01
   3.72495979e-01  5.89095235e-01  5.28909504e-01 -1.43021822e-01
   1.29158068e+00  1.01404309e+00  5.24685718e-02  7.07785606e-01
  -9.92524743e-01  1.06645405e+00 -3.31768799e+00  1.17010665e+00
  -2.12201308e-02  2.12407207e+00  4.52856362e-01 -2.13094807e+00
   1.22240877e+00  1.76483440e+00  5.63757479e-01  2.27482840e-02
  -7.82578051e-01  6.55233979e-01 -1.68829048e+00 -1.97470605e+00
   1.15701902e+00  1.73967704e-01 -3.92328799e-01 -1.08283997e+00
  -4.23514605e-01  1.08190870e+00  1.95784342e+00  7.40396738e-01
   3.15758914e-01  1.24500239e+00 -1.83886051e-01 -4.03300762e-01
   9.63891506e-01 -2.66965342e+00 -1.08974135e+00 -6.47073984e-01
  -4.20866877e-01  4.16669101e-02  8.49810839e-01 -3.79641682e-01
   2.59560299e+00  5.98434627e-01  1.56527415e-01  9.97580409e-01
   1.54037014e-01 -2.14870143e+00  1.50862932e+00  1.11552691e+00
  -5.53656481e-02 -7.64868081e-01 -1.87906563e+00 -2.33234227e-01
   8.24601233e-01 -9.07707214e-01 -4.32973951e-01 -1.68721700e+00
   1.23623168e+00 -2.51207888e-01  1.23811102e+00  2.35921681e-01
   4.45801646e-01 -9.52544808e-01 -3.31173278e-02 -2.10778141e+00
   4.41052318e-01  1.64004862e-01  2.00633675e-01 -5.18458903e-01
  -3.22922134e+00  2.38281751e+00 -9.11737144e-01  6.25919461e-01
  -2.02143624e-01  1.52351081e+00  8.71844411e-01 -2.43385211e-01
  -8.49280506e-02  1.05988121e+00  1.83530784e+00 -2.30041528e+00
   1.40521213e-01  7.97963321e-01  1.82160223e+00  4.02089119e-01
  -1.67293236e-01  1.87644828e-02  1.16083391e-01 -1.42770684e+00
   6.14264548e-01 -2.52290398e-01  4.14186388e-01 -8.00498366e-01
  -3.00527096e+00 -6.18923366e-01 -2.17158389e+00  1.04483557e+00
  -5.66190720e-01  7.34840930e-01  5.15204251e-01  5.56284308e-01
  -2.09723878e+00  1.15115583e+00 -1.10679615e+00 -4.25647855e-01
   8.58844280e-01  1.12773860e+00  3.92792225e-01  1.50818968e+00
   8.81523192e-01  5.06254733e-01  2.12855029e+00 -8.24182689e-01
  -6.23940408e-01  3.69376481e-01  7.31829882e-01  2.11863950e-01
  -3.22329223e-01 -1.09455287e+00 -3.03726941e-01  1.15464973e+00
   5.92558719e-02  7.71303236e-01 -3.19416970e-01 -2.24378204e+00
  -9.95674074e-01  9.79117930e-01  2.53102446e+00  4.09687936e-01
   8.75167489e-01 -1.08757794e+00  6.02834940e-01  5.47595501e-01
   1.99013817e+00  2.27569073e-01 -1.28223801e+00 -1.12424254e-01
  -1.09338664e-01 -1.70605540e+00  7.90782928e-01  1.53871334e+00
   8.81542802e-01  1.16352689e+00 -5.02888978e-01  9.98773873e-01
   1.07848799e+00 -2.70794839e-01 -1.82933703e-01 -2.59171438e+00
   2.43784285e+00 -1.38736653e+00  5.82733572e-01  1.48292851e+00
  -7.17417896e-01 -9.00878310e-01 -1.06900764e+00 -2.23889370e-02
  -4.66238171e-01 -1.62953481e-01 -1.70614541e+00 -7.78070807e-01
  -1.55341291e+00 -2.99770981e-01 -1.61867094e+00  5.90640843e-01
  -7.11393476e-01 -1.14230692e+00 -1.06358933e+00 -2.29082108e+00
   1.90523052e+00  1.34190202e+00 -1.12566185e+00 -1.62631798e+00
  -1.74330199e+00  1.76145005e+00  5.88413775e-01  4.87259477e-01
   6.37570381e-01 -2.10904884e+00 -2.73248434e+00  1.36632144e+00
   5.17102361e-01  1.81352019e+00 -1.52900949e-01  1.61472940e+00
   7.80211449e-01 -4.45352674e-01 -1.07424057e+00 -3.38628143e-01
   8.42668891e-01 -1.22865999e+00 -2.18160963e+00  1.04563057e+00
  -3.17455816e+00 -7.15807676e-02 -1.08275485e+00 -1.37529865e-01
  -4.97523583e-02  2.35528573e-01 -8.55209589e-01  6.81321993e-02
  -4.71212149e-01  7.51155466e-02  9.49396926e-04  1.19919086e+00
   1.09794879e+00  1.03884578e+00 -2.55186439e+00  9.94643629e-01
  -2.58225846e+00  8.86693716e-01  8.08496475e-01  2.30989739e-01
  -1.24884295e+00 -1.52868414e+00  4.29533243e-01  5.46709061e-01
  -7.35393822e-01 -3.64296675e-01 -1.53052461e+00 -1.97628391e+00
  -3.41788739e-01  7.84188509e-01 -2.41260886e+00 -7.48425961e-01
  -1.05449247e+00  2.88430423e-01  1.84820342e+00  6.74161792e-01
   1.41794845e-01 -5.98825574e-01  3.73886198e-01 -2.48169422e+00
  -2.27255845e+00  6.57229364e-01 -1.13276875e+00 -5.16856849e-01
  -6.26782835e-01  9.48999941e-01 -9.35740620e-02  3.50742370e-01
   2.98568994e-01 -1.01826072e+00  4.13419724e-01  1.95179069e+00
  -9.75452304e-01  1.53202677e+00  2.69419169e+00 -6.85079396e-01
  -1.90429139e+00  1.14199138e+00  7.08431363e-01  1.18132889e+00
   3.70904803e-01  1.61715853e+00  5.48776269e-01 -7.83207536e-01
   1.64842278e-01  2.39785656e-01  8.29384923e-01 -1.93895519e-01
  -1.16729867e+00 -1.81284279e-01 -2.17861629e+00 -1.83944032e-02
   6.66685343e-01  6.09669685e-01  5.20295858e-01 -3.53740901e-01
  -1.17392814e+00 -2.94326997e+00 -1.37023851e-01  9.54749465e-01
  -4.95018989e-01 -3.49337012e-01 -9.07384217e-01  1.31747472e+00
  -2.54682332e-01  1.53595412e+00  2.21261811e+00  1.39771974e+00
   2.76299238e+00  5.38638473e-01 -1.76050842e+00  7.06786215e-01
  -7.02481449e-01 -1.10682738e+00 -7.84055272e-04 -2.80038333e+00
  -1.28540707e+00  1.71843624e+00  2.44381762e+00 -2.75315881e+00
   1.28551567e+00  9.19860154e-02 -2.58155704e-01  1.78802982e-01
   9.01393533e-01 -2.08326268e+00  4.65304144e-02 -1.14663476e-02
  -6.55864000e-01  1.34701610e-01 -5.88080943e-01  1.55230534e+00
   9.63730633e-01 -1.20693719e+00 -2.06381734e-02  2.99363166e-01
   1.93804479e+00 -3.03180647e+00  4.48871315e-01 -5.11733830e-01
   1.49265826e+00 -2.96806842e-01 -1.63406193e-01  6.51193798e-01
  -1.74828672e+00 -1.65763772e+00 -1.25743532e+00 -7.76271641e-01
  -7.30356336e-01 -1.42850906e-01 -1.21008182e+00  4.31737840e-01
   1.73356509e+00 -1.00885141e+00  5.99896491e-01  1.76815796e+00
   1.70825934e+00  1.84353828e+00 -6.87059820e-01  5.29210627e-01
   1.07635176e+00 -1.11646628e+00  1.71096578e-01 -4.89157289e-01
  -1.22443688e+00  9.23360884e-01  2.66687535e-02  1.14623439e+00
   6.79304957e-01  1.81946516e-01 -2.04774499e+00 -2.12440252e+00
  -9.29764450e-01 -4.34975803e-01  1.36932423e-02 -3.67694259e-01
  -1.00733125e+00 -1.16315138e+00  2.26713800e+00 -1.31239080e+00
   5.92628837e-01  8.70307237e-02  2.06333899e+00 -1.66019177e+00
  -3.97760153e-01 -1.26266098e+00 -7.93232262e-01  1.30879915e+00
   9.85410452e-01 -1.19214833e-01  2.12099695e+00  8.54894280e-01
  -1.04825747e+00  4.67185020e-01  2.45230627e+00  3.27637792e-01
   4.10351396e-01  3.18526089e-01 -6.41824365e-01  9.89028871e-01
  -3.61118078e-01  5.67802906e-01  8.15989852e-01  3.40589285e-01
  -2.75196403e-01  8.75216424e-01  8.48937750e-01  7.71055929e-03
   1.55311751e+00  2.54990363e+00  3.77822310e-01  2.52370447e-01
  -1.15722322e+00  7.29334056e-01 -7.09478378e-01 -1.28088033e+00
  -7.55041480e-01  3.00284553e+00  5.56356490e-01  8.64883602e-01
   1.59248769e+00 -7.46075869e-01 -1.23655856e+00 -5.81199944e-01
   5.60756862e-01  3.34257066e-01  9.54841077e-01 -7.23705664e-02
   6.81679666e-01 -6.59659088e-01  1.19679809e+00  2.49275714e-01
  -2.27278852e+00 -4.41923380e-01 -2.12598860e-01  1.01453349e-01
   3.81671041e-01 -1.17353344e+00  6.83259428e-01 -1.09273970e+00
   2.57141560e-01 -1.08354104e+00 -1.31781495e+00 -8.91826153e-01
  -1.97217894e+00  9.93841827e-01 -1.92777868e-02  5.42257667e-01
```
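
One possible explanation for the run-to-run difference, assuming this is the stock `arcface_torch` script: in the versions I have seen, `inference.py` falls back to a randomly generated image when `--img` is not provided, so each run without an explicit image embeds different random noise and prints different numbers. It is worth trying an explicit, aligned 112x112 face, e.g. `python inference.py --weight work_dirs/face_umd_r100/model.pt --network r100 --img <path-to-face.jpg>`, and checking whether the output becomes stable.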

And about the accuracy problem: when I just replace the pics without changing the meta files of IJB-C, it still shows 100% accuracy...

I even input a dog pic... 😩

HHCorp commented 1 year ago

> And about the accuracy problem: when I just replace the pics without changing the meta files of IJB-C, it still shows 100% accuracy... I even input a dog pic... 😩

That's weird... maybe you need to debug the verification function and check whether the features of the pairs and the similarity scores are normal.
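
For example, one quick sanity check along those lines is to score a few genuine and impostor pairs by hand and confirm the two groups separate at all. A minimal sketch, independent of the repo's eval scripts (`feat_a` / `feat_b` are hypothetical embeddings of two images):

```python
import numpy as np

def cosine_similarity(a, b):
    # L2-normalize before comparing; otherwise raw dot products
    # are not comparable across pairs
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return float(np.dot(a, b))

# stand-ins for two 512-d embeddings produced by inference.py
feat_a = np.random.randn(512)
feat_b = np.random.randn(512)
print(cosine_similarity(feat_a, feat_b))
```

If every pair, including the dog picture, still comes out at the same score, the eval script is likely pairing features according to the stale meta files rather than the images that were swapped in.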