Open 994374821 opened 5 years ago
@994374821 Thank you for your good question. "skip_ == true" means some unexpected situation happened, which rarely occurs. If this situation does happen, we choose to skip the diff of the current batch instead of backpropagating it. Because the training dataset is noisy, we don't want the model to learn from the noisy data directly.
I have some confusion about the backward pass of NoiseTolerantFRLayer: when skip_ is true, why is bottom_diff multiplied by zero? Why not just return, as in the case "iter < startiter"?
```cpp
template <typename Dtype>
void NoiseTolerantFRLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down,
    const vector<Blob<Dtype>*>& bottom)
{
    if (propagate_down[0])
    {
        const Dtype* label_data = bottom[2]->cpu_data();
        const Dtype* top_diff = top[0]->cpu_diff();
        Dtype* bottom_diff = bottom[0]->mutable_cpu_diff();
        const Dtype* weight_data = weights_.cpu_data();
        // ...
    }
}
```