anheidelonghu / ACNet

ACNet: Attention Complementary Network for RGBD semantic segmentation

Question about loss weight #1

Closed. charlesCXK closed this issue 5 years ago.

charlesCXK commented 5 years ago

Hi, thank you for your excellent work! I have a question about nyuv2_frq. I calculated the frequency of each category in the NYU Depth V2 training set, and it looks like this:

[0.23660696, 0.1187167, 0.07648914, 0.04909946, 0.04807328, 0.03532588, 0.02892444, 0.02396406, 0.02282296, 0.0225052, 0.02245426, 0.01931515, 0.01841655, 0.01462058, 0.01229422, 0.01178623, 0.01178341, 0.01145094, 0.01054233, 0.00951471, 0.00907144, 0.00895788, 0.00728474, 0.00719785, 0.00619533, 0.00517752, 0.00460452, 0.00431567, 0.00439997, 0.00389129, 0.00374951, 0.00374431, 0.00360172, 0.00359986, 0.00353324, 0.00334604, 0.00306539, 0.02311192, 0.02512652, 0.06131882]

From this, I can see that some categories are very rare. Could you explain how to convert these frequencies into your weights (listed below)? Thank you!

 nyuv2_frq = [0.04636878, 0.10907704, 0.152566  , 0.28470833, 0.29572534,
        0.42489686, 0.49606689, 0.49985867, 0.45401091, 0.52183679,
        0.50204292, 0.74834397, 0.6397011 , 1.00739467, 0.80728748,
        1.01140891, 1.09866549, 1.25703345, 0.9408835 , 1.56565388,
        1.19434108, 0.69079067, 1.86669642, 1.908     , 1.80942453,
        2.72492965, 3.00060817, 2.47616595, 2.44053651, 3.80659652,
        3.31090131, 3.9340523 , 3.53262803, 4.14408881, 3.71099056,
        4.61082739, 4.78020462, 0.44061509, 0.53504894, 0.21667766]
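
For context, a weight vector like nyuv2_frq is typically passed to a weighted cross-entropy loss. A minimal PyTorch sketch, assuming standard nn.CrossEntropyLoss usage (not necessarily ACNet's actual training code):

    import torch
    import torch.nn as nn

    # First 4 of the 40 per-class weights above, truncated for brevity.
    nyuv2_frq = [0.04636878, 0.10907704, 0.152566, 0.28470833]

    # Rare classes carry large weights so the loss is not dominated
    # by frequent classes such as wall and floor.
    criterion = nn.CrossEntropyLoss(weight=torch.tensor(nyuv2_frq))

    logits = torch.randn(2, len(nyuv2_frq), 8, 8)          # (N, C, H, W) scores
    target = torch.randint(0, len(nyuv2_frq), (2, 8, 8))   # (N, H, W) labels
    loss = criterion(logits, target)
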
anheidelonghu commented 5 years ago

The code actually uses the weights from https://github.com/tum-vision/fusenet/blob/master/fusenet/data/nyuv2_40class_weight.txt. I also tried the weights I calculated myself, but they did not work as well.
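
For reference, a common recipe for computing such weights yourself is median frequency balancing (Eigen and Fergus, 2015): weight_c = median(freq) / freq_c. A minimal sketch of that recipe, applied to the frequencies from the first comment; it lands close to, but does not exactly reproduce, the FuseNet weight file:

    import numpy as np

    # NYUv2-40 class frequencies from the first comment in this thread.
    freqs = np.array([
        0.23660696, 0.1187167, 0.07648914, 0.04909946, 0.04807328,
        0.03532588, 0.02892444, 0.02396406, 0.02282296, 0.0225052,
        0.02245426, 0.01931515, 0.01841655, 0.01462058, 0.01229422,
        0.01178623, 0.01178341, 0.01145094, 0.01054233, 0.00951471,
        0.00907144, 0.00895788, 0.00728474, 0.00719785, 0.00619533,
        0.00517752, 0.00460452, 0.00431567, 0.00439997, 0.00389129,
        0.00374951, 0.00374431, 0.00360172, 0.00359986, 0.00353324,
        0.00334604, 0.00306539, 0.02311192, 0.02512652, 0.06131882])

    # Median frequency balancing: weight_c = median(freq) / freq_c,
    # so rare classes get weights > 1 and frequent classes < 1.
    weights = np.median(freqs) / freqs

    # e.g. weights[0] ~= 0.049 vs. 0.04637 in nyuv2_frq: close, but the
    # FuseNet file was evidently computed with slightly different
    # statistics or a slightly different procedure.
    print(weights)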

charlesCXK commented 5 years ago

> The code actually uses the weights from https://github.com/tum-vision/fusenet/blob/master/fusenet/data/nyuv2_40class_weight.txt. I also tried the weights I calculated myself, but they did not work as well.

Thank you very much!