sentinel-hub / field-delineation

Field delineation with Sentinel-2 data from Sentinel-Hub and a ResUnet-a architecture.
MIT License

Calling Softmax twice? #4

Closed sen-pai closed 3 years ago

sen-pai commented 3 years ago

Hi, thanks for this clean implementation of ResUnet-a! In niva_models.py, the last layer for each of the extent, boundary, and distance outputs is a Softmax activation, so the model already returns probabilities. However, in TanimotoDistanceLoss the default value of from_logits is True, which causes y_pred (already softmax probabilities) to pass through another Softmax. Have I misunderstood something?
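For illustration, here is a minimal NumPy sketch (my own, not code from this repository) of why the double application matters: the second Softmax pulls an already-confident probability vector back toward uniform.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

logits = np.array([4.0, 0.0])   # raw network outputs
probs = softmax(logits)         # [0.982, 0.018] -- a confident prediction
double = softmax(probs)         # [0.724, 0.276] -- confidence washed out
```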

Also, the original paper has four outputs (HSV, extent, boundary, and distance); is there a reason HSV was omitted here?

devisperessutti commented 3 years ago

Hi @sen-pai ,

You are correct, there is an issue with the published notebook: it should be `TanimotoDistanceLoss(from_logits=False)`. Thank you for reporting!

We will shortly release an updated version of this code with this bug fixed. You can now also find the ResUnet-a architecture in eo-flow.
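For reference, the corrected instantiation looks like this (a minimal sketch; the import path is an assumption and may differ depending on where TanimotoDistanceLoss lives in your checkout):

```python
from eoflow.models.losses import TanimotoDistanceLoss  # import path is an assumption

# The model's heads already end in Softmax, so tell the loss it receives
# probabilities rather than logits:
loss = TanimotoDistanceLoss(from_logits=False)
```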

Re: HSV, we decided to skip it in the first iteration since it was not clear to us what processing is required to get from the B-G-R-NIR bands to HSV. We also expected the conditioned losses to have a larger impact than the loss on HSV. Feel free to open an MR with changes/suggestions to add the HSV output.

sen-pai commented 3 years ago

Thanks for the quick reply! @devisperessutti
I have written my own PyTorch implementation and was comparing it against your code to check for bugs in mine.

The paper uses a hybrid version of the Tanimoto loss (Tanimoto with complement), which is `(tanimoto(pred, target) + tanimoto(1 - pred, 1 - target)) / 2`. Is there some reason this version was not used? Thanks again for this implementation.
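For concreteness, a small NumPy sketch of that hybrid formulation as I read it from the paper (my own illustration, not code from either repository):

```python
import numpy as np

def tanimoto(p, l, eps=1e-8):
    # Tanimoto coefficient: <p, l> / (|p|^2 + |l|^2 - <p, l>)
    num = np.sum(p * l)
    den = np.sum(p * p) + np.sum(l * l) - num
    return num / (den + eps)

def tanimoto_with_complement(p, l):
    # average the coefficient on (p, l) with the one on the complements
    return 0.5 * (tanimoto(p, l) + tanimoto(1 - p, 1 - l))

# loss = 1 - tanimoto_with_complement(pred, target)
```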

devisperessutti commented 3 years ago

If I'm not mistaken, this is already taken into account: both y_true and y_pred are in a one-hot encoding format, where the first channel holds (1 - target) and the second holds target, and the channels are then reduced in the loss, so the complement term is included implicitly.
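A quick numeric check (a self-contained NumPy sketch, not the repository's code) that the two-channel one-hot stacking reproduces the hybrid formulation exactly:

```python
import numpy as np

def tanimoto(p, l, eps=1e-8):
    num = np.sum(p * l)
    den = np.sum(p * p) + np.sum(l * l) - num
    return num / (den + eps)

rng = np.random.default_rng(0)
target = rng.integers(0, 2, size=100).astype(float)  # binary mask
pred = rng.random(100)                               # positive-class probabilities

# one-hot style stacking: channel 0 = complement, channel 1 = positive class
y_true = np.stack([1 - target, target])
y_pred = np.stack([1 - pred, pred])

per_channel = np.mean([tanimoto(y_pred[c], y_true[c]) for c in range(2)])
hybrid = 0.5 * (tanimoto(pred, target) + tanimoto(1 - pred, 1 - target))
assert np.isclose(per_channel, hybrid)  # identical by construction
```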

sen-pai commented 3 years ago

I guess I missed that. Thanks for the clarification.