Sha-Lab / FEAT

The code repository for "Few-Shot Learning via Embedding Adaptation with Set-to-Set Functions"
MIT License

Questions with respect to the calculation of the KL divergence. #15

Closed. d12306 closed this issue 5 years ago.

d12306 commented 5 years ago

Hello @Han-Jia, thanks for your implementation. When you calculate the KL divergence, why do you formalize the attention labels like that? How does this implementation reflect the property of the contrastive loss proposed in the paper?

    # construct attention label
    att_label_basis = []
    for i in range(args.way):
        temp = torch.eye(args.way + 1)
        temp[i, i] = 0.5
        temp[-1, -1] = 0.5
        temp[i, -1] = 0.5
        temp[-1, i] = 0.5
        att_label_basis.append(temp)

Could you please explain it a little bit more, or is there any reference I can have a look at?

Thanks,

Han-Jia commented 5 years ago

Hi,

As mentioned in the paper, FEAT and FEAT* use different regularizers.

Since the specific test instance is incorporated into the transformer adaptation in FEAT*, it provides a kind of supervision to regularize the attention. For example, the labels of the transformer inputs take the form (a, b, c, d, e, c) when we have a 5-way 1-shot support set and the test instance comes from class c. In this case, we want to regularize each input to attend to the instances of its own class. In other words, we want the attention of the 3rd instance to be concentrated on itself and on the test instance.
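
To make that concrete, here is a minimal sketch of the target attention for that (a, b, c, d, e, c) example and a KL term computed against it. The attention map `att`, the random stand-in values, and the `batchmean` reduction are assumptions for illustration, not the repository's exact training code:

    import torch
    import torch.nn.functional as F

    way = 5          # 5-way 1-shot support set (a, b, c, d, e)
    query_class = 2  # the test instance comes from class c (index 2)

    # Target attention: every input attends to itself, except that the class-c
    # support instance and the test instance (last row/column) split their
    # attention, 0.5 each, between themselves and each other.
    att_label = torch.eye(way + 1)
    att_label[query_class, query_class] = 0.5
    att_label[-1, -1] = 0.5
    att_label[query_class, -1] = 0.5
    att_label[-1, query_class] = 0.5

    # Stand-in for the transformer's row-stochastic attention over the
    # (way + 1) inputs; in practice this comes from the set-to-set function.
    att = F.softmax(torch.randn(way + 1, way + 1), dim=-1)

    # KL(att_label || att) used as a regularizer on the attention map.
    reg = F.kl_div(att.log(), att_label, reduction='batchmean')
    print(att_label)
    print(reg)

Printing att_label shows a 6 x 6 matrix in which only the 3rd and last rows/columns carry the 0.5 entries; every other input keeps a 1 on its own diagonal position.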

d12306 commented 5 years ago

@Han-Jia, thanks, but there are always some 1s in the attention label matrix. Can they be substituted with 0.5? They also denote embeddings that belong to the same class, just like the 0.5 entries do.

Han-Jia commented 5 years ago

Hello, the att_label_basis is built from an identity matrix with only a few entries changed to 0.5, so the untouched diagonal entries remain 1. You can also build this matrix by matching the labels of the test instance and the support set.
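
A 1 stays on the diagonal for any input whose class appears only once among the (way + 1) inputs, so its target attention falls entirely on itself; the 0.5 entries appear exactly where a class occurs twice (the matched support instance and the test instance). A small sketch of that label-matching construction, with illustrative variable names rather than the repository's, would be:

    import torch

    # Labels of the transformer inputs: a 5-way 1-shot support set (a, b, c, d, e)
    # followed by a test instance from class c -> class indices (0, 1, 2, 3, 4, 2).
    labels = torch.tensor([0, 1, 2, 3, 4, 2])

    # 1 where two inputs share a class, 0 otherwise; normalizing each row to sum
    # to 1 reproduces the 0.5 entries for the matched pair and keeps a 1 on the
    # diagonal for every singleton class, matching att_label_basis[2].
    same_class = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()
    att_label = same_class / same_class.sum(dim=1, keepdim=True)
    print(att_label)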

d12306 commented 5 years ago

Thank you so much. Appreciate your help.