Please look into the repo.
Why use class weights 1, 13, 120? Can we use other numbers? On what basis did you choose these weights?
class_loss_weights: 1
class_loss_weights: 13
class_loss_weights: 120
The weight given to each class was inversely proportional to the frequency of that class in the dataset. In our case, the weight was inversely proportional to the number of pixels belonging to each class across the entire dataset, so that the resulting loss contribution is roughly equal for all classes.
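A minimal sketch of how such inverse-frequency weights could be derived (the pixel counts below are made-up illustrative values, not the actual dataset statistics):

```python
import numpy as np

# Hypothetical per-class pixel counts over the whole training set,
# e.g. background, class 1, class 2 (example values only).
pixel_counts = np.array([1_000_000, 77_000, 8_300], dtype=np.float64)

# Inverse-frequency weighting: scale so the most frequent class gets weight 1
# and rarer classes get proportionally larger weights.
class_loss_weights = pixel_counts.max() / pixel_counts

print(class_loss_weights)  # approximately [1, 13, 120] for counts in this ratio
```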
This was inspired from the U-Net paper. In equation (2) in https://arxiv.org/pdf/1505.04597.pdf, we use w_c as the weight. We have not used the second term.
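For reference, equation (2) of the U-Net paper defines the per-pixel weight map as

$$w(\mathbf{x}) = w_c(\mathbf{x}) + w_0 \cdot \exp\!\left(-\frac{\left(d_1(\mathbf{x}) + d_2(\mathbf{x})\right)^2}{2\sigma^2}\right)$$

where $w_c$ is the class-balancing weight and $d_1$, $d_2$ are distances to the nearest and second-nearest object border. As noted above, only the class-balancing term $w_c$ is used here; the border-emphasis term is dropped.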
I cannot find step2_weights.caffemodel.