Closed sunpeng1996 closed 6 years ago
Hi! What do you mean by "the distribution is too large"? If you mean that the values are higher than 1, that is OK: the softmax will squash all values into the probability range anyway.
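To illustrate the point above, here is a minimal sketch showing that softmax maps logits of any magnitude to a valid probability distribution (example values are made up):

```python
import torch
import torch.nn.functional as F

# Raw network outputs (logits) can be arbitrarily large or negative...
logits = torch.tensor([[2.0, 8.0, 5.0, -3.0]])

# ...but softmax always squashes them into [0, 1], summing to 1
probs = F.softmax(logits, dim=1)
```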
Are you training encoder or the full network (decoder mode)?
For this 7% drop did you do any modification to the network or is it only changing the weights?
You can try the ones that are uploaded in my erfnet_pytorch code (here):
# weight holds one value per Cityscapes class (19 classes)
if (enc): #encoder
    weight[0] = 2.3653597831726
    weight[1] = 4.4237880706787
    weight[2] = 2.9691488742828
    weight[3] = 5.3442072868347
    weight[4] = 5.2983593940735
    weight[5] = 5.2275490760803
    weight[6] = 5.4394111633301
    weight[7] = 5.3659925460815
    weight[8] = 3.4170460700989
    weight[9] = 5.2414722442627
    weight[10] = 4.7376127243042
    weight[11] = 5.2286224365234
    weight[12] = 5.455126285553
    weight[13] = 4.3019247055054
    weight[14] = 5.4264230728149
    weight[15] = 5.4331531524658
    weight[16] = 5.433765411377
    weight[17] = 5.4631009101868
    weight[18] = 5.3947434425354
else: #decoder
    weight[0] = 2.8149201869965 #road
    weight[1] = 6.9850029945374 #sidewalk
    weight[2] = 3.7890393733978 #building
    weight[3] = 9.9428062438965 #wall
    weight[4] = 9.7702074050903 #fence
    weight[5] = 9.5110931396484 #pole
    weight[6] = 10.311357498169 #traffic light
    weight[7] = 10.026463508606 #traffic sign
    weight[8] = 4.6323022842407 #vegetation
    weight[9] = 9.5608062744141 #terrain
    weight[10] = 7.8698215484619 #sky
    weight[11] = 9.5168733596802 #person
    weight[12] = 10.373730659485 #rider
    weight[13] = 6.6616044044495 #car
    weight[14] = 10.260489463806 #truck
    weight[15] = 10.287888526917 #bus
    weight[16] = 10.289801597595 #train
    weight[17] = 10.405355453491 #motorcycle
    weight[18] = 10.138095855713 #bicycle
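For reference, class weights like these are typically passed to a weighted cross-entropy loss in PyTorch. The snippet below is a minimal sketch of that usage (the exact loss wrapper and tensor shapes in erfnet_pytorch may differ):

```python
import torch
import torch.nn as nn

NUM_CLASSES = 19
weight = torch.ones(NUM_CLASSES)
weight[0] = 2.8149201869965  # road
weight[1] = 6.9850029945374  # sidewalk
# (set the remaining entries from the decoder listing above)

criterion = nn.CrossEntropyLoss(weight=weight)

# N x C x H x W logits and N x H x W integer labels, as in semantic segmentation
logits = torch.randn(2, NUM_CLASSES, 8, 8)
labels = torch.randint(0, NUM_CLASSES, (2, 8, 8))
loss = criterion(logits, labels)
```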
Oh, thanks a lot! I will try yours. By "the distribution is too large" I mean that the first class weight is 0.0819 while another is 5.2286; the ratio 5.2286/0.0819 (about 64) is too big. Please look at my weights.
Yes, that distribution is a bit weird. What code did you use for calculation? Your weights are basically making the model "ignore" the classes with very small values (0.0819) and boost the ones with very high values. Did you try the weights from the pytorch code?
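A small sketch of why such a spread effectively makes the model ignore the low-weight class: for the same prediction error, the loss (and gradient) contribution scales with the class weight. The toy values below are taken from the weights quoted in this thread; `reduction='sum'` is used so the per-class scaling is visible directly.

```python
import torch
import torch.nn as nn

# Same logits, same target: only the weight of the target class differs
logits = torch.tensor([[0.0, 1.0]])
target = torch.tensor([0])

low  = nn.CrossEntropyLoss(weight=torch.tensor([0.0819, 1.0]),
                           reduction='sum')(logits, target)
high = nn.CrossEntropyLoss(weight=torch.tensor([5.2286, 1.0]),
                           reduction='sum')(logits, target)

# The high-weight class contributes ~64x more loss than the low-weight one
ratio = (high / low).item()
```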
I'm closing this but if you have more questions just reopen it. Thanks!
Hi Eromera, thanks a lot for your wonderful work. Recently I have been training ERFNet from scratch on my own data. I noticed that you used different class weights for encoder training and decoder training. Could you please explain the reason behind this? Is it just because different datasets are used for them?
hi, I used the ENet calculate_class_weighting.py script to generate the loss weights for my training, and I found some problems.
First, the weights are: 0.0819 0.4754 0.1324 1.5224 1.5190 2.4730 8.1865 5.2286 0.1870 1.4695 0.6893 1.9814 7.8091 0.4164 1.3809 1.1982 0.6273 5.3535 4.0939, so the distribution is too large.
Second, when I train with these weights in the loss, the mean IoU drops by 7%. Could you give me some tricks and help? Thanks a lot!
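For context, the ENet paper's class-weighting scheme is w_class = 1 / ln(c + p_class) with c = 1.02, where p_class is the class's pixel frequency; with that formula every weight lies between 1/ln(2.02) ≈ 1.42 and 1/ln(1.02) ≈ 50.5, so weights below 1 (like the 0.0819 above) suggest the script normalizes or computes frequencies differently. A minimal sketch of the formula (function name and example counts are illustrative):

```python
import numpy as np

def enet_class_weights(class_pixel_counts, c=1.02):
    """ENet-style weighting: w_class = 1 / ln(c + p_class),
    where p_class is the fraction of pixels belonging to the class."""
    counts = np.asarray(class_pixel_counts, dtype=np.float64)
    p = counts / counts.sum()
    return 1.0 / np.log(c + p)

# A dominant class (e.g. road) gets a small weight, rare classes get large ones
w = enet_class_weights([1_000_000, 50_000, 1_000])
```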