PingoLH / FCHarDNet

Fully Convolutional HarDNet for Segmentation in Pytorch
MIT License

Bug in _bootstrap_xentropy_single #30

Closed ismael-elatifi closed 4 years ago

ismael-elatifi commented 4 years ago

In loss.py, in the function _bootstrap_xentropy_single, there is a bug in the following code portion:

        if sorted_loss[K] > thresh:  # the bug is here, it should be `sorted_loss[K] <= thresh`
            loss = sorted_loss[sorted_loss > thresh]
        else:
            loss = sorted_loss[:K]

Because with the current code, if the K-th loss value is below the threshold, losses below the threshold will still be kept (else clause), which is not what you want.

PingoLH commented 4 years ago

Hello, thank you for the comment. I think it is just a different logic for loss extraction. Our implementation differs slightly from the original bootstrapped loss. Here is our logic:

  1. Extract all losses larger than the threshold.
  2. Extract at least K elements for each instance of a mini-batch, because training the model on only a few losses would be biased.
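The two rules above can be sketched as follows. This is a plain-Python illustration (the repo operates on torch tensors), and `extract_bootstrap_losses` is a hypothetical helper name, not a function from loss.py:

```python
def extract_bootstrap_losses(losses, K, thresh):
    """Keep all per-element losses above `thresh`, but never fewer than K.

    Mirrors the branch discussed above: after sorting in descending order,
    if the K-th largest loss is still above the threshold, then more than K
    losses exceed it, so keep all of them; otherwise fall back to the
    top-K losses so training never sees too few elements.
    """
    sorted_loss = sorted(losses, reverse=True)
    if sorted_loss[K] > thresh:
        # more than K losses exceed the threshold: keep every one of them
        return [x for x in sorted_loss if x > thresh]
    # fewer than K losses exceed the threshold: keep the K largest anyway
    return sorted_loss[:K]

# With thresh=0.5, three losses exceed it, so all three are kept (> K=2):
print(extract_bootstrap_losses([0.9, 0.8, 0.7, 0.1, 0.05], K=2, thresh=0.5))
# With thresh=0.75, only two exceed it, so the top-K fallback applies:
print(extract_bootstrap_losses([0.9, 0.8, 0.7, 0.1, 0.05], K=2, thresh=0.75))
```

So the `else` branch keeping some below-threshold losses is deliberate: it is the floor that guarantees at least K elements per instance.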
ismael-elatifi commented 4 years ago

Ok, I see, my bad. So indeed it is not a bug. I am closing the issue, which is actually not one :-)