lfz / DSB2017

The solution of team 'grt123' in DSB2017

Issues with layers.py at line 210 #135

Open · totesarana commented 1 year ago

When you do `loss = classify_loss` and then add the regression losses to the `loss` tensor in place, you also modify `classify_loss` at the same time.

This is because the assignment makes `loss` and `classify_loss` refer to the same underlying memory. You should either copy the tensor first, or simply return `classify_loss + torch.sum(torch.stack(regress_losses))` instead of building up a separate `loss` tensor and returning it.
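A minimal standalone sketch of the aliasing problem and the two suggested fixes is below. The variable names `classify_loss` and `regress_losses` mirror the ones in the issue; this is illustrative code, not the actual contents of layers.py.

```python
import torch

# Stand-in values; in layers.py these would come from the network outputs.
classify_loss = torch.tensor(0.5)
regress_losses = [torch.tensor(0.1), torch.tensor(0.2)]

# Problematic pattern: `loss` is just another name for the same tensor,
# so each in-place `+=` also changes `classify_loss`.
loss = classify_loss
for r in regress_losses:
    loss += r
print(classify_loss)  # tensor(0.8000) -- no longer 0.5

# Fix 1: copy the tensor before accumulating into it.
classify_loss = torch.tensor(0.5)
loss = classify_loss.clone()
for r in regress_losses:
    loss += r
print(classify_loss)  # tensor(0.5000) -- unchanged

# Fix 2 (as suggested above): build the total out-of-place and return it.
total_loss = classify_loss + torch.sum(torch.stack(regress_losses))
print(total_loss)     # tensor(0.8000)
```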