deepinsight / insightface

State-of-the-art 2D and 3D Face Analysis Project
https://insightface.ai

partial fc precision #1606

Open markncx opened 3 years ago

markncx commented 3 years ago

I compared Partial FC (r=0.2, PyTorch version) with full softmax on my dataset (1 million identities). The precision of Partial FC (r=0.2) is lower than that of softmax. Could you give some advice? Thanks!

anxiangsir commented 3 years ago

How many epochs in your setting?

markncx commented 3 years ago

> How many epochs in your setting?

50 epochs

anxiangsir commented 3 years ago

Can you give us more information, such as test methods? The accuracy diff?

markncx commented 3 years ago

I apply Partial FC to an image retrieval task, and I use mAP as the test metric. The accuracy diff is 0.05.
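For reference, mAP (mean average precision) over ranked retrieval results can be sketched in plain Python as follows; the relevance lists in the example are hypothetical:

```python
def average_precision(ranked_relevance):
    # AP for one query: average of precision@k over the ranks k where a
    # relevant item appears (1 = relevant, 0 = not relevant).
    hits, precisions = 0, []
    for k, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            precisions.append(hits / k)
    return sum(precisions) / len(precisions) if precisions else 0.0

def mean_average_precision(rankings):
    # mAP: mean of the per-query AP values.
    return sum(average_precision(r) for r in rankings) / len(rankings)

# Two hypothetical queries: relevant hits at ranks 1 and 3, and at rank 2.
print(mean_average_precision([[1, 0, 1], [0, 1, 0]]))  # ~0.667
```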

anxiangsir commented 3 years ago

I have the following suggestions:

  1. Increasing the margin after sampling may help accuracy.
  2. Sampling tends to work better when the dataset is dirty.
  3. Partial FC works well on tens of millions of identities; model parallelism is sufficient in your case.
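On point 1, here is a minimal sketch of what increasing the margin does to the positive-class logit, assuming an ArcFace-style additive angular margin (the margin and scale values are illustrative defaults, not tied to any particular training config):

```python
import math

def margin_logit(cos_theta, margin=0.5, scale=64.0):
    # ArcFace-style additive angular margin: shift the angle of the
    # positive-class cosine by `margin`, then scale. Negative-class
    # logits would simply be scale * cos_theta.
    theta = math.acos(max(-1.0, min(1.0, cos_theta)))
    return scale * math.cos(theta + margin)

# A larger margin lowers the positive logit for the same embedding,
# which pushes training toward tighter intra-class clusters.
print(margin_logit(0.8, margin=0.3))  # larger than...
print(margin_logit(0.8, margin=0.5))  # ...this
```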
markncx commented 3 years ago

> I have the following suggestions:
>
>   1. Increasing the margin after sampling may help accuracy.
>   2. Sampling tends to work better when the dataset is dirty.
>   3. Partial FC works well on tens of millions of identities; model parallelism is sufficient in your case.

Thanks for your advice. I still have several questions:

  1. Partial FC with negative sampling randomly samples negative classes, which means it misses most of the negative classes in any one iteration. Does Partial FC have difficulty classifying similar identities (compared with softmax)? If not, does Partial FC require more epochs to converge (perhaps more epochs would help it accumulate global negative information)?
  2. The loss does not decrease when lr=0.0001 (it does with the original softmax).
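For context on question 1, the negative-class sampling being discussed can be sketched roughly as follows. This is a simplified stand-in for the actual Partial FC implementation; the function name and exact sampling scheme are illustrative:

```python
import random

def sample_classes(num_classes, positive_ids, sample_rate=0.2):
    # Simplified Partial FC-style sampling: always keep the classes that
    # appear in the current batch (the positives), then pad with randomly
    # chosen negative classes up to sample_rate * num_classes.
    num_sample = max(len(positive_ids), int(sample_rate * num_classes))
    negatives = [c for c in range(num_classes) if c not in set(positive_ids)]
    sampled = random.sample(negatives, num_sample - len(positive_ids))
    return sorted(positive_ids) + sampled

# With 1,000 classes and r=0.2, each step sees only 200 class centers,
# so most negatives are indeed absent from any single iteration.
selected = sample_classes(1000, positive_ids=[3, 7, 42], sample_rate=0.2)
```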