isaaccorley / torchseg

Segmentation models with pretrained backbones. PyTorch.
MIT License

Using "segmentation_models_pytorch" with Unet++ #4

Open alqurri77 opened 5 months ago

alqurri77 commented 5 months ago

I was trying to use segmentation_models_pytorch with Unet++. I looked at the example in HuBMAP - Pytorch smp Unet++. My understanding is that Unet++ returns 4 outputs (as part of deep supervision), but in the example I notice it only returns one output. I think I'm missing something here.
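
For reference, a minimal sketch of the behavior described above, assuming the standard segmentation_models_pytorch API: `UnetPlusPlus` returns a single mask tensor rather than the per-level deep-supervision outputs from the paper.

```python
import torch
import segmentation_models_pytorch as smp

# Standard smp UNet++: the forward pass returns one mask tensor,
# not the intermediate deep-supervision outputs.
model = smp.UnetPlusPlus(encoder_name="resnet34", encoder_weights=None, in_channels=3, classes=1)
x = torch.randn(2, 3, 256, 256)
out = model(x)
print(out.shape)  # torch.Size([2, 1, 256, 256])
```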

isaaccorley commented 5 months ago

Hi @alqurri77, I didn't write the initial smp implementation of UNet++ or the tutorial, but I think you're right that the model should be able to optionally return the intermediate outputs for deep supervision. I'll open an issue and get a PR going for this. Stay tuned!

Thanks!

alqurri77 commented 5 months ago

Thank you very much, Isaac!

notprime commented 4 months ago

Hi @alqurri77 ,

I'm commenting to add some info and spell out what needs to be implemented. Yes, you are right: for deep supervision we have to store the intermediate outputs of all hidden layers, from both the encoder and the decoder, so that we can compute the discrepancy between each decoder block's output and the corresponding down-sampled version from the encoder block. Deep supervision can be used with any model, by the way, so we should return these outputs on request for every model. Different papers explore different discrepancy losses (MSE, cross-entropy, or classic segmentation losses), and there is no heuristic for which one is better. We could simply return the necessary feature maps and leave the loss computation in the hands of the user; see the sketch after the references below. What do you think @isaaccorley?

A couple of references:
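
A hypothetical sketch of the "leave the loss to the user" approach (the function name and the list-of-logits convention are illustrative, not torchseg's API): upsample each intermediate output to the target resolution and average a standard loss over the levels.

```python
import torch
import torch.nn.functional as F

def deep_supervision_loss(outputs: list[torch.Tensor], target: torch.Tensor) -> torch.Tensor:
    """outputs: per-level logits, e.g. the UNet++ heads; target: (N, 1, H, W) binary mask."""
    loss = torch.zeros((), device=target.device)
    for logits in outputs:
        # Match each intermediate map to the target resolution before scoring it.
        logits = F.interpolate(logits, size=target.shape[-2:], mode="bilinear", align_corners=False)
        loss = loss + F.binary_cross_entropy_with_logits(logits, target.float())
    return loss / len(outputs)
```

Any of the discrepancy losses mentioned above (MSE, cross-entropy, Dice, etc.) could be swapped in for the BCE term.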

isaaccorley commented 4 months ago

Yes, my initial thought is to have an optional arg that tells the decoder, and the overall model, to return the encoder and intermediate decoder outputs in the forward call; something along the lines of the sketch below.
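
A minimal sketch of that optional flag, assuming a typical encoder/decoder/head layout. The class and attribute names are illustrative, not torchseg's actual implementation, and it assumes the decoder exposes its per-stage outputs as a list.

```python
import torch
import torch.nn as nn

class SegmentationModelSketch(nn.Module):
    """Illustrative only: returns intermediates when asked, just the final mask otherwise."""

    def __init__(self, encoder: nn.Module, decoder: nn.Module, head: nn.Module,
                 return_intermediate: bool = False):
        super().__init__()
        self.encoder = encoder
        self.decoder = decoder
        self.head = head
        self.return_intermediate = return_intermediate

    def forward(self, x: torch.Tensor):
        features = self.encoder(x)                # list of encoder feature maps
        decoder_outputs = self.decoder(features)  # assumed: list of per-stage decoder outputs
        masks = self.head(decoder_outputs[-1])    # final prediction from the last stage
        if self.return_intermediate:
            return masks, features, decoder_outputs
        return masks
```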

alqurri77 commented 4 months ago

Yes, that sounds good! @notprime @isaaccorley