QinbinLi / MOON

Model-Contrastive Federated Learning (CVPR 2021)
MIT License

Hi, #23

Closed judge-x closed 6 months ago

judge-x commented 7 months ago

Thank you for your wonderful work. I'm a little confused about the code:

```python
for previous_net in previous_nets:
    previous_net.cuda()
    _, pro3, _ = previous_net(x)
    nega = cos(pro1, pro3)
    logits = torch.cat((logits, nega.reshape(-1, 1)), dim=1)

    previous_net.to('cpu')
```

First, in the paper, the negative representation seems to come from the local model of the last round only, not from all previous rounds. I want to know if the code here makes sense. Second, in `logits = torch.cat((logits, nega.reshape(-1,1)), dim=1)`, what is the purpose of the concatenation? Looking forward to your reply, thank you!!!

QinbinLi commented 6 months ago

Hi @judge-x ,

  1. `previous_nets` is a list of size 1 by default, which only includes the local model from the last round.
  2. This makes it easy to compute the contrastive loss via cross-entropy. By concatenating the positive and negative pairs into one logits tensor, the contrastive loss can be computed with the cross-entropy loss in line 311.
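For context, the trick described above can be sketched as follows. This is a minimal standalone illustration, not the repository's exact code: the variable names (`pro1`, `pro2`, `pro3`, `posi`, `nega`, `temperature`) mirror the ones discussed in this thread, and the tensors are random stand-ins for the model representations. With the positive-pair similarity placed in column 0 and the target label set to 0, cross-entropy over the concatenated logits reduces exactly to the InfoNCE-style contrastive loss.

```python
import torch
import torch.nn as nn

# Hedged sketch of MOON's model-contrastive loss computed via cross-entropy.
# pro1/pro2/pro3 are random placeholders for the representations from the
# current local model, the global model, and the previous local model.
torch.manual_seed(0)
batch, dim, temperature = 4, 8, 0.5
cos = nn.CosineSimilarity(dim=-1)

pro1 = torch.randn(batch, dim)  # current local model representation (anchor)
pro2 = torch.randn(batch, dim)  # global model representation (positive pair)
pro3 = torch.randn(batch, dim)  # last-round local model repr. (negative pair)

posi = cos(pro1, pro2)                    # shape: (batch,)
logits = posi.reshape(-1, 1)              # column 0 holds the positive pair
nega = cos(pro1, pro3)
logits = torch.cat((logits, nega.reshape(-1, 1)), dim=1)  # column 1: negative
logits /= temperature

# Every sample's "correct class" is column 0 (the positive pair), so
# cross-entropy gives -log(exp(posi/T) / (exp(posi/T) + exp(nega/T))).
labels = torch.zeros(batch, dtype=torch.long)
loss = nn.CrossEntropyLoss()(logits, labels)
print(loss.item())
```

With more than one negative (e.g. if `previous_nets` held several past models), each additional `nega` column would simply be concatenated the same way, and the label would remain 0.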
judge-x commented 6 months ago

ok, thank you for your answer :-)