EIDOSLAB / entangling-disentangling-bias

EnD: Entangling and Disentangling deep representations for bias correction | CVPR21 https://doi.org/10.1109/CVPR46437.2021.01330
https://openaccess.thecvf.com/content/CVPR2021/html/Tartaglione_EnD_Entangling_and_Disentangling_Deep_Representations_for_Bias_Correction_CVPR_2021_paper.html
MIT License

About the order for "sum" and "abs" in End.py #2

Closed · bregmangh closed this issue 3 years ago

bregmangh commented 3 years ago

Nice work! But I'm a little confused. According to equation (7) in the paper, shouldn't `orthogonal_loss += torch.abs(torch.sum(gram * bias_mask))` in End.py be `orthogonal_loss += torch.sum(torch.abs(gram * bias_mask))`?

carloalbertobarbano commented 3 years ago

Hi, sorry for the delay in the response! I was on vacation.

You are correct, this comes from an older version of our codebase. However, when using ReLU activations the two forms are equivalent: the feature vectors are element-wise non-negative, so every entry of the Gram matrix is non-negative, and with a binary mask every term of the masked sum is non-negative as well, making the absolute value of the sum equal to the sum of the absolute values.
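
For reference, a minimal sketch of this equivalence (assuming normalized ReLU features and a binary mask; `gram` and `bias_mask` here are illustrative stand-ins for the variables in End.py, not the repo's actual data):

```python
import torch

# With ReLU activations, (normalized) feature vectors are element-wise
# non-negative, so every entry of their Gram matrix is non-negative.
features = torch.relu(torch.randn(8, 16))                       # batch of ReLU features
features = features / (features.norm(dim=1, keepdim=True) + 1e-8)
gram = features @ features.t()                                  # all entries >= 0
bias_mask = (torch.rand(8, 8) > 0.5).float()                    # binary mask, entries in {0, 1}

# Since every term of gram * bias_mask is >= 0, abs(sum) == sum(abs):
a = torch.abs(torch.sum(gram * bias_mask))  # form in the released code
b = torch.sum(torch.abs(gram * bias_mask))  # form matching equation (7)
assert torch.allclose(a, b)
```

With other activations (or unnormalized features), the Gram matrix can contain negative entries and the two forms would differ, so the `sum(abs(...))` form is the one that matches equation (7) in general.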