wutong16 / DistributionBalancedLoss

[ ECCV 2020 Spotlight ] PyTorch implementation for "Distribution-Balanced Loss for Multi-Label Classification in Long-Tailed Datasets"

How to use the ResampleLoss? #19

Closed crazy-zxx closed 9 months ago

crazy-zxx commented 1 year ago

I tried to use ResampleLoss in my code, but it does not work properly.

  1. My training code uses the standard training loop:

pred = model(img).sigmoid()           # sigmoid applied to the model output
loss = config.loss_func(pred, label)  # loss_func is the ResampleLoss instance
optimizer.zero_grad()
loss.backward()                       # the error is raised here!
optimizer.step()

  2. I have already changed the loss to reduction='mean'.

  3. This leaves the following error, which I cannot resolve:

Traceback (most recent call last):
  File "/home/xxx/run.py", line 5, in <module>
    train()
  File "/home/xxx/train.py", line 73, in train
    loss.backward()
  File "/home/xxx/miniconda3/envs/test/lib/python3.8/site-packages/torch/_tensor.py", line 307, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, inputs=inputs)
  File "/home/xxx/miniconda3/envs/test/lib/python3.8/site-packages/torch/autograd/__init__.py", line 154, in backward
    Variable._execution_engine.run_backward(
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [32, 15]], which is output 0 of SigmoidBackward0, is at version 1; expected version 0 instead. Hint: the backtrace further above shows the operation that failed to compute its gradient. The variable in question was changed in there or anywhere later. Good luck!
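
The error is easy to reproduce in isolation: any in-place modification of a sigmoid output before backward() raises exactly this message, because sigmoid's backward pass needs its own (unmodified) output. A minimal sketch, with shapes matching the [32, 15] tensor in the traceback:

import torch

x = torch.randn(32, 15, requires_grad=True)
pred = x.sigmoid()     # pred is "output 0 of SigmoidBackward0", version 0
pred += 0.1            # any in-place op bumps pred to version 1
pred.sum().backward()  # RuntimeError: ... is at version 1; expected version 0

So something inside the loss function appears to be modifying pred in place after my explicit .sigmoid() call.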

I hope you can help me with how to use this loss function correctly. Thanks!
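
If I read mllt/models/losses/resample_loss.py correctly (this is an assumption on my part, please confirm), ResampleLoss expects raw logits: its binary_cross_entropy helper wraps F.binary_cross_entropy_with_logits, so the sigmoid is applied inside the loss, and the logit-regularization step updates the prediction tensor in place (logits += init_bias), which would explain the traceback above. Under that assumption, a corrected training step would be:

# Sketch only: loss_func is assumed to be a ResampleLoss instance; the loss
# applies the sigmoid internally, so the model output is passed as raw logits.
logits = model(img)                     # no .sigmoid() here
loss = config.loss_func(logits, label)
optimizer.zero_grad()
loss.backward()                         # the in-place logit_reg no longer touches
                                        # a tensor needed by SigmoidBackward0
optimizer.step()

Passing pred.clone() instead would also silence the autograd error, but the predictions would then go through the sigmoid twice, once in my code and once inside the loss.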