pshashk / seesaw-facenet

SeesawFaceNets: sparse and robust face verification model for mobile platform
https://arxiv.org/abs/1908.09124

Li-ArcFace implementation #3

Open rose-jinyang opened 4 years ago

rose-jinyang commented 4 years ago

Hello, how are you? Thanks for contributing this project. I have a question: where can I find the Li-ArcFace implementation? Thanks.

pshashk commented 4 years ago

Hello. Fine, thanks. I've used this code for Li-ArcFace:

import torch
from torch import nn
import torch.nn.functional as F
from math import pi

class LiArcFace(nn.Module):
    def __init__(self, num_classes, emb_size=512, m=0.4, s=64.0):
        super().__init__()
        # Class-center weight matrix, one row per identity.
        self.weight = nn.Parameter(torch.empty(num_classes, emb_size))
        nn.init.xavier_normal_(self.weight)
        self.m = m  # additive angular margin
        self.s = s  # logit scale

    def forward(self, input, label):
        W = F.normalize(self.weight)
        input = F.normalize(input)
        # Cosine similarity between embeddings and class centers,
        # clamped so acos stays differentiable at +/-1.
        cosine = (input @ W.t()).clamp(-1.0 + 1e-7, 1.0 - 1e-7)
        theta = torch.acos(cosine)
        # Add the margin m only to the target-class angle.
        m = torch.zeros_like(theta)
        m.scatter_(1, label.view(-1, 1), self.m)
        # Linear mapping of the angle: s * (pi - 2 * (theta + m)) / pi
        scale = -2 * self.s / pi
        return self.s + scale * (theta + m)
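
For context, a minimal usage sketch continuing from the snippet above; the class count, batch size, and random inputs are illustrative, not from the repo. The module returns scaled logits that go straight into standard cross-entropy:

# Illustrative usage (values not from the repo); reuses the imports and
# LiArcFace class defined above.
num_classes, emb_size, batch = 1000, 512, 32
head = LiArcFace(num_classes, emb_size)
embeddings = torch.randn(batch, emb_size)          # would come from the backbone
labels = torch.randint(0, num_classes, (batch,))   # ground-truth identities
logits = head(embeddings, labels)
loss = F.cross_entropy(logits, labels)
loss.backward()
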
rose-jinyang commented 4 years ago

Hello. Thanks for your reply. I tried to train several models with ArcFace, but it was difficult to get them to converge well. Is Li-ArcFace effective for high-dimensional embedding learning as well as low-dimensional? Could you provide the training code too? Thanks.

pshashk commented 4 years ago

Unfortunately, I can't provide the training code. For training, I used Adam with an LR scheduler (linear warmup followed by cosine decay).
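
A minimal sketch of that setup, assuming PyTorch's built-in Adam and LambdaLR; the warmup length, total step count, learning rate, and stand-in model are illustrative, not values from the author:

import math
import torch

# Illustrative hyperparameters (not from the author).
warmup_steps, total_steps, base_lr = 1_000, 100_000, 3e-4

model = torch.nn.Linear(512, 1000)  # stand-in for backbone + LiArcFace head
optimizer = torch.optim.Adam(model.parameters(), lr=base_lr)

def lr_lambda(step):
    # Linear warmup from 0 to base_lr, then cosine decay to 0.
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    progress = min(1.0, (step - warmup_steps) / max(1, total_steps - warmup_steps))
    return 0.5 * (1.0 + math.cos(math.pi * progress))

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)
# Inside the training loop: optimizer.step() followed by scheduler.step().
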

BRO-HAMMER commented 3 years ago

@pshashk Hello, I'm trying to implement the Li-ArcFace loss variant in Keras. The relevant part of my code for normal ArcFace looks like this:

# ...
theta = tf.acos(K.clip(logits, -1.0 + K.epsilon(), 1.0 - K.epsilon()))  # to avoid NaN during backprop
# target_logits = tf.cos(theta + self.m)
target_logits = tf.cos(K.clip(theta + self.m, 0, math.pi))  # to avoid possible leakage (neg penalty) when t + m > pi
logits = logits * (1 - y_true) + target_logits * y_true
logits *= self.s
# ...

So, please correct me if I'm wrong (I'm not familiar with PyTorch), but Li-ArcFace would look like:

# ...
theta = tf.acos(K.clip(logits, -1.0 + K.epsilon(), 1.0 - K.epsilon()))  # to avoid NaN during backprop
target_logits = theta + self.m
logits = logits * (1 - y_true) + target_logits * y_true
logits = self.s * (math.pi - 2 * logits) / math.pi
# ...

Thank you in advance!
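
For comparison, a minimal Keras/TF sketch that mirrors the PyTorch reference earlier in the thread, where the linear map s * (pi - 2 * angle) / pi is applied to the angle of every class and only the target class receives the margin; the names (logits, y_true, self.m, self.s, K, tf, math) follow the snippets above:

# Sketch mirroring the PyTorch reference: convert all logits to angles,
# add the margin only where y_true == 1, then apply the linear remapping.
theta = tf.acos(K.clip(logits, -1.0 + K.epsilon(), 1.0 - K.epsilon()))  # avoid NaN during backprop
theta_m = theta + self.m * y_true                  # margin on the target class only
logits = self.s * (math.pi - 2.0 * theta_m) / math.pi
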