Open protossw512 opened 6 years ago
@protossw512
Hi, the batch size is 8 for both CNN6 and CNN7. You can definitely use a larger batch size such as 16 or 32, but you then have to increase the number of epochs. However, you should always monitor your loss and make sure that the training of the model has converged.
I have no specific reason for choosing values in [1, 64]. You can also normalise them to [0, 1]. But if you do so, you have to change the Wing loss function's parameters accordingly when you use the Wing loss. For L1 and L2 loss, it does not matter. Another important thing is that, if you normalise the targets to [0, 1], you should also increase the learning rate.
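To see why the target range matters for the Wing loss but not for L1/L2, here is a minimal NumPy sketch of the loss. The default parameters w=10 and eps=2 are assumptions chosen for pixel-scale targets (e.g. [0, 64]); if targets are rescaled to [0, 1], w and eps would need to shrink by the same factor, since they define the (absolute) error range where the log-shaped part applies.

```python
import numpy as np

def wing_loss(pred, target, w=10.0, eps=2.0):
    """Sketch of the Wing loss: log-shaped for small errors,
    L1-like for large ones. w and eps are in the same units as
    the targets, so normalising targets to [0, 1] without
    rescaling w and eps changes which regime the loss operates in."""
    x = np.abs(pred - target)
    # C makes the two pieces meet continuously at |x| = w
    C = w - w * np.log(1.0 + w / eps)
    return np.where(x < w, w * np.log(1.0 + x / eps), x - C)
```

For example, with the defaults above an error of 0.5 pixels falls well inside the log region, but after normalising 64-pixel coordinates to [0, 1] the same landmark error becomes ~0.008, and with unscaled w and eps the loss is nearly flat there, which is one reason a larger learning rate helps.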
@FengZhenhua Thank you so much for your fast response, my confusions are resolved.
Hi,
I would like to know what your batch size was for training CNN6 and CNN7, since I would like to estimate how many epochs I should train to get performance similar to yours.
Another question: I noticed you are regressing absolute coordinate values (0–63) at the end, instead of relative positions (0–1). Do you have any specific reason for doing that?
I noticed this from your demo code:
Please correct me if I am wrong. Thank you.