meng-tang / rloss

Regularized Losses (rloss) for Weakly-supervised CNN Segmentation
MIT License

Loss definition: from the paper to the implementation #14

Open ReubenDo opened 4 years ago

ReubenDo commented 4 years ago

Hello,

First, I would like to say that I really enjoyed your work. Congrats! I have a simple question regarding the dense CRF loss: I do not understand how the implementation works. Why is the loss a simple dot product? I am probably missing what AS is in your code.

https://github.com/meng-tang/rloss/blob/1caa759e568db2c7209ab73e73ac039ea3d7101c/pytorch/pytorch-deeplab_v3_plus/DenseCRFLoss.py#L34

Thanks a lot, Reuben

ReubenDo commented 4 years ago

Does it in fact correspond to the formula in the appendix? [image: formula from the paper's appendix]

mkusner commented 4 years ago

+1 I have the same question!

mkusner commented 4 years ago

Oh actually it looks like #3 talks about this. Is it right @meng-tang that the loss is:

[image: proposed loss formula]

Or did I miss something?

meng-tang commented 4 years ago

Yes, it corresponds to the formula. Sorry for the inconsistent notation: A and w both denote the affinity matrix, and S and X both denote the segmentation variable. AS is the product of the affinity matrix and the segmentation, so the loss reduces to a dot product between S and AS.
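To see why a dot product suffices, here is a small NumPy sketch (toy random data, not the repo's code) checking that the relaxed Potts energy Σ_k S_kᵀ A (1 − S_k) equals a constant minus Σ_k S_kᵀ (A S_k), because the soft segmentations sum to 1 at every pixel:

```python
import numpy as np

rng = np.random.default_rng(0)

N, K = 6, 3                           # N pixels, K classes (toy sizes)
A = rng.random((N, N))
A = (A + A.T) / 2                     # symmetric affinity matrix, as for a CRF kernel
S = rng.random((N, K))
S = S / S.sum(axis=1, keepdims=True)  # soft segmentations: rows sum to 1

ones = np.ones(N)

# Relaxed Potts energy: sum_k S_k^T A (1 - S_k)
potts = sum(S[:, k] @ A @ (1 - S[:, k]) for k in range(K))

# Since sum_k S_k = 1 at every pixel, sum_k S_k^T A 1 = 1^T A 1 is a constant,
# so the energy is that constant minus the dot product of S with AS.
constant = ones @ A @ ones
dot_form = sum(S[:, k] @ (A @ S[:, k]) for k in range(K))

print(np.allclose(potts, constant - dot_form))  # True
```

Minimizing the Potts energy is therefore the same as maximizing the dot product ⟨S, AS⟩, which is what line 34 computes (negated).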

Note that for dense CRF we never explicitly store A in our implementation; instead, AS is computed efficiently with fast bilateral filtering.
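For intuition, here is a hedged sketch that materializes a dense Gaussian bilateral affinity A on a tiny toy image and forms AS explicitly; the image data and the kernel bandwidths `sx`, `sr` are made up for illustration, and the actual implementation replaces the A @ S product with fast bilateral filtering rather than ever building A:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "image": H x W pixels with random RGB values in [0, 1]
H, W, K = 4, 4, 2
N = H * W
rgb = rng.random((N, 3))
xy = np.stack(np.meshgrid(np.arange(H), np.arange(W), indexing="ij"),
              axis=-1).reshape(N, 2).astype(float)

# Dense bilateral affinity:
# A_ij = exp(-|p_i - p_j|^2 / (2 sx^2) - |I_i - I_j|^2 / (2 sr^2))
sx, sr = 2.0, 0.3   # hypothetical spatial / range bandwidths
A = np.exp(-((xy[:, None] - xy[None]) ** 2).sum(-1) / (2 * sx ** 2)
           - ((rgb[:, None] - rgb[None]) ** 2).sum(-1) / (2 * sr ** 2))
np.fill_diagonal(A, 0.0)              # no self-affinity

# Soft segmentation S (N x K), rows sum to 1
S = rng.random((N, K))
S /= S.sum(axis=1, keepdims=True)

AS = A @ S                            # "AS": the segmentation filtered by the kernel
loss = -np.sum(S * AS)                # the dot product <S, AS>, negated
```

On a real image N is far too large for an N x N matrix, which is why AS is computed by fast (e.g. permutohedral-lattice) bilateral filtering instead.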