yaoxufeng / PCL-Proxy-based-Contrastive-Learning-for-Domain-Generalization

MIT License

Questions on PCL loss #7

Open PhoebeChen123 opened 2 years ago

PhoebeChen123 commented 2 years ago
  1. For formula (6) in the paper, why is it z_j instead of z_i in the second term of Z?
  2. For class ProxyPLoss in losses.py, `label = torch.zeros(logits.size(0), dtype=torch.long).to(feature.device)` — why are the labels all set to 0 here?
yuxinww commented 2 years ago

`pred = torch.masked_select(pred.transpose(1, 0), label)` ... `logits = torch.cat([pred, feature], dim=1)` — Could you explain why `logits[0][0]` computes the distance from the 1st proxy to the first sample of its own class (not necessarily the first sample in the batch), while `logits[0][1:]` computes the distances from the 1st sample to its negatives?
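A small sketch may clarify the ordering question above. This is an illustration under assumed shapes (a `(batch, num_proxies)` similarity matrix `pred` and hypothetical class labels), not the repo's exact tensors: `torch.masked_select` flattens in row-major order, so after transposing `pred` to `(num_proxies, batch)`, each proxy's row keeps only its same-class samples in batch order — hence proxy 0's positive is the first sample *of class 0*, not necessarily batch sample 0.

```python
import torch

# Hypothetical proxy-similarity matrix: pred[b][p] = similarity of sample b to proxy p.
pred = torch.arange(12, dtype=torch.float).reshape(4, 3)  # (batch=4, num_proxies=3)
labels = torch.tensor([1, 0, 2, 1])                       # assumed class of each sample

# Boolean mask of shape (num_proxies, batch): mask[p][b] is True iff sample b has class p.
mask = labels.unsqueeze(0) == torch.arange(3).unsqueeze(1)

# masked_select flattens row-major over the transposed (num_proxies, batch) matrix,
# so the selected entries are grouped by proxy, each proxy's samples in batch order.
positives = torch.masked_select(pred.transpose(1, 0), mask)
print(positives)
# positives = [3., 1., 10., 8.]: the first entry pairs proxy 0 with batch sample 1
# (the first sample whose class is 0), not with batch sample 0.
```

Here sample 0 belongs to class 1, so proxy 0's first (and only) same-class sample is sample 1 — which is exactly why the first selected positive need not be the first sample in the batch.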

YukiFan commented 2 years ago
> 1. For formula (6) in the paper, why is it z_j instead of z_i in the second term of Z?
> 2. For class ProxyPLoss in losses.py, `label = torch.zeros(logits.size(0), dtype=torch.long).to(feature.device)` — why are the labels all set to 0 here?

If you can read Chinese, this post explains question 2: https://blog.csdn.net/lgzlgz3102/article/details/124642336
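For readers who cannot access that post, here is a minimal sketch of the idea behind the all-zero labels (an assumption about ProxyPLoss's mechanics based on the snippet quoted above, not the repo's exact code): for each sample, the positive logit (similarity to its own class proxy) is concatenated as column 0 and the negative logits follow, so cross-entropy with target 0 for every row maximizes the positive relative to the negatives — the standard InfoNCE pattern.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
batch, num_neg = 4, 6
pos = torch.randn(batch, 1)        # hypothetical positive logits (sample vs. its class proxy)
neg = torch.randn(batch, num_neg)  # hypothetical negative logits

# Column 0 holds each sample's positive, the rest are negatives.
logits = torch.cat([pos, neg], dim=1)

# All-zero labels simply declare "index 0 is the correct class" for every row;
# they do not mean every sample belongs to class 0.
label = torch.zeros(logits.size(0), dtype=torch.long)
loss = F.cross_entropy(logits, label)
```

So the zeros are an artifact of where the positive logit sits in the concatenated tensor, not a statement about the samples' actual classes.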