tim-learn / SHOT

Code released for our ICML 2020 paper "Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation"
MIT License

Is there a problem with this line of your code? initc = aff.transpose().dot(all_fea) #36

Closed RSMung closed 2 years ago

RSMung commented 2 years ago

https://github.com/tim-learn/SHOT/blob/9c02a95dbb632f15ea4926c1cc9d32e4faa20df1/object/image_target.py#L278

  1. The API is torch.transpose(input, dim0, dim1) → Tensor, which requires two dimensions as arguments. But your code calls aff.transpose() with no arguments — is that correct?

  2. On the same line, the dot() function requires both operands to be one-dimensional tensors; see the PyTorch documentation: https://pytorch.org/docs/stable/generated/torch.dot.html#torch.dot But all_fea and aff each have at least two dimensions because of the batch size, so this also looks wrong to me. What is your view, author?

  3. I am reading the section of the paper on self-supervised pseudo-labeling. Is the corresponding code the function def obtain_label(loader, netF, netB, netC, args)?

RSMung commented 2 years ago

I saw "From CHINA" on your profile, so I wrote in Chinese directly. Looking forward to your reply.

RSMung commented 2 years ago

I just noticed these are NumPy operations, so questions 1 and 2 resolve themselves. Sorry for the trouble.
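For reference, the point of confusion can be sketched in a few lines of NumPy. The shapes below are illustrative placeholders, not the actual sizes used in SHOT: unlike torch.transpose (which needs two dims) and torch.dot (1-D only), NumPy's ndarray.transpose() with no arguments reverses all axes, and ndarray.dot() on 2-D arrays performs matrix multiplication.

```python
import numpy as np

# Hypothetical stand-ins for the variables in obtain_label:
# aff: (N, K) soft cluster affinities, all_fea: (N, D) features.
rng = np.random.default_rng(0)
aff = rng.random((8, 3))
all_fea = rng.random((8, 5))

# transpose() without arguments reverses the axes: (N, K) -> (K, N).
# dot() on 2-D arrays is matrix multiplication, so the result is (K, D),
# i.e. affinity-weighted sums of features per class (unnormalized centroids).
initc = aff.transpose().dot(all_fea)
print(initc.shape)  # (3, 5)
```

This is why the line is valid as written: it operates on NumPy arrays, not PyTorch tensors, so the PyTorch API constraints in questions 1 and 2 do not apply.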