Closed dsuess closed 5 years ago
My solution for 2 would look like this:
```python
def DeepSupervision(criterion, xs, y, *, xs2=None):
    """
    Args:
        criterion: loss function
        xs: tuple of inputs
        y: ground truth
        xs2: optional tuple of extra inputs, passed through to criterion
    """
    loss = 0.
    if xs2 is None:
        for x in xs:
            loss += criterion(x, y)
    else:
        for x, x2 in zip(xs, xs2):
            loss += criterion(x, y, x2)
    return loss
```
By forcing xs2 to be passed as a keyword argument, we make sure we get a proper error when a fourth positional argument is passed unintentionally.
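As a self-contained sketch of that behavior (with a toy `dummy_criterion` standing in for a real loss function, which is my own placeholder, not from the repo):

```python
# Same DeepSupervision as proposed above, repeated so this sketch runs standalone.
def DeepSupervision(criterion, xs, y, *, xs2=None):
    loss = 0.
    if xs2 is None:
        for x in xs:
            loss += criterion(x, y)
    else:
        for x, x2 in zip(xs, xs2):
            loss += criterion(x, y, x2)
    return loss

# Toy criterion: absolute error, plus an extra term when a third input is given.
def dummy_criterion(x, y, x2=None):
    return abs(x - y) + (0 if x2 is None else abs(x2 - y))

print(DeepSupervision(dummy_criterion, (1.0, 3.0), 2.0))                  # two-argument path -> 2.0
print(DeepSupervision(dummy_criterion, (1.0, 3.0), 2.0, xs2=(2.0, 4.0)))  # three-argument path -> 4.0

# The keyword-only xs2 turns an accidental fourth positional argument
# into an immediate TypeError instead of silently misbehaving:
try:
    DeepSupervision(dummy_criterion, (1.0,), 2.0, (2.0,))
except TypeError as exc:
    print("caught:", exc)
```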
@dseuss You can ignore DeepSupervision.
For example, you can change the code:
```python
if isinstance(features, tuple):
    global_loss, local_loss = DeepSupervision(criterion_metric, features, pids, local_features)
else:
    global_loss, local_loss = criterion_metric(features, pids, local_features)
```
to

```python
global_loss, local_loss = criterion_metric(features, pids, local_features)
```

It will work. I think you will never use DeepSupervision apart from some special cases (such as video re-ID).
@michuanhaohao Thanks for your quick reply. Makes sense. Also, I never got to run that code; it was just my linter complaining about the function call having too many arguments.
I thought DeepSupervision would be used for mutual learning. If that's not the case, how can I use deep mutual learning with the current implementation?
@dseuss I have removed deep mutual learning in AlignedReID++. If you want to use mutual learning, you can refer to another repo (https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch).
Thanks a lot @michuanhaohao
The current training code should fail, e.g. here, when we use deep mutual learning. The reason is that DeepSupervision expects a single xs and a single y value. I was planning to submit a PR, but wanted to double-check first what would be the right solution:

features by concatenating local and global features?

Even though 2. seems to be much easier to do, 1. feels like the right thing to do.
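To make the reported failure concrete, here is a minimal sketch, with toy stand-ins for the repo's losses (the `criterion_metric` below is hypothetical; only the two-argument shape of DeepSupervision reflects the issue's description):

```python
# Simplified DeepSupervision as described in the issue: it forwards
# only (x, y) to the criterion.
def DeepSupervision(criterion, xs, y):
    loss = 0.
    for x in xs:
        loss += criterion(x, y)
    return loss

# A metric criterion for mutual learning needs a third argument
# (toy stand-in, not the repo's real loss):
def criterion_metric(x, y, local_x):
    return abs(x - y) + abs(local_x - y)

# Routing a three-argument criterion through the two-argument
# DeepSupervision fails with a TypeError:
try:
    DeepSupervision(criterion_metric, (1.0, 3.0), 2.0)
except TypeError as exc:
    print("TypeError:", exc)
```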
P.S. Thanks for this nice implementation -- I like the style and it's very easy to read.