Open · wtomin opened this issue 6 years ago
Hi, I read about your work and it's great! I have a question about the loss function used in your code, 1 - CCC. I have used 1 - CCC as a loss function before, but it did not work well for me: the loss on the training set drops gradually, while the loss on the validation set keeps fluctuating and never really decreases. In the end I had to switch to MSE instead. Have you ever run into this problem when training with the 1 - CCC loss? If so, how did you solve it?

I have trained with 1 - CCC and it works fine. The fluctuation could be caused by dropout, if you use any. Also, one way to reduce the fluctuation is to decay the learning rate, say every 50 epochs.
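For reference, here is a minimal sketch of the ideas discussed above, assuming a PyTorch setup (the repository's actual framework and training code are not shown in this thread). It implements a standard 1 - CCC loss from Lin's concordance correlation coefficient and pairs it with a step-wise learning-rate decay every 50 epochs; the function and variable names (`ccc_loss`, `pred`, `target`, `eps`) are illustrative, not taken from the repository.

```python
import torch
from torch.optim.lr_scheduler import StepLR


def ccc_loss(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """1 - CCC loss, where
    CCC = 2 * cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2).
    """
    pred_mean = pred.mean()
    target_mean = target.mean()
    pred_var = pred.var(unbiased=False)
    target_var = target.var(unbiased=False)
    # Covariance between predictions and targets over the batch.
    cov = ((pred - pred_mean) * (target - target_mean)).mean()
    ccc = 2.0 * cov / (pred_var + target_var + (pred_mean - target_mean) ** 2 + eps)
    return 1.0 - ccc


# Hypothetical training skeleton with learning-rate decay every 50 epochs,
# as suggested in the reply (model, optimizer, and data loader are placeholders).
# optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# scheduler = StepLR(optimizer, step_size=50, gamma=0.1)
# for epoch in range(num_epochs):
#     for inputs, labels in train_loader:
#         optimizer.zero_grad()
#         loss = ccc_loss(model(inputs), labels)
#         loss.backward()
#         optimizer.step()
#     scheduler.step()
```

Because the loss is computed from batch statistics (means, variances, covariance), small batches can make 1 - CCC noisier than a per-sample loss like MSE, which may also contribute to the validation fluctuation described above.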