lmb-freiburg / flownet2

FlowNet 2.0: Evolution of Optical Flow Estimation with Deep Networks
https://lmb.informatik.uni-freiburg.de/Publications/2017/IMKDB17/

problem with downsample_layer.cpp/cu #98

Closed PkuRainBow closed 6 years ago

PkuRainBow commented 6 years ago

Hi, I have compiled the program successfully. However, I ran into a problem when trying to train FlowNet2-C on the FlyingChairs dataset: the following error is raised from the backpropagation code.

lines 132-139 of downsample_layer.cpp and downsample_layer.cu:

template <typename Dtype>
void DownsampleLayer<Dtype>::Backward_gpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down, const vector<Blob<Dtype>*>& bottom) {
  for (int i = 0; i < propagate_down.size(); i++)
    if (propagate_down[i])
      LOG(FATAL) << "DownsamplingLayer cannot do backward.";
}

Here is the corresponding layer from your provided prototxt:

I am wondering why no backpropagation is allowed here and how to solve this problem.

layer {
  name: "Downsample1"
  type: "Downsample"
  bottom: "scaled_flow_gt"
  bottom: "predict_flow6"
  top: "blob28"
}
PkuRainBow commented 6 years ago

@nikolausmayer

nikolausmayer commented 6 years ago

Hi, the Downsample layer brings the ground-truth blob down to the same resolution as the predicted-flow blob. It does not do backpropagation because there is nothing to optimize here: the gradient information from the loss is backpropagated via the predict_flow bottom blob of the flow_loss layer, so it is fine to disable this pathway in the downsampling layer.
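For illustration only, here is a minimal standalone C++ sketch of that idea (this is not the repository's Caffe code; the helper names downsample_nearest and l1_loss_grad are made up for the example): the ground truth is resampled to the prediction's resolution, and the loss gradient is computed only with respect to the prediction, so nothing ever needs to flow back through the downsampling step.

// Standalone sketch: the downsampled ground truth is a constant, so the loss
// gradient only needs to be propagated to the predicted flow.
#include <cmath>
#include <cstdio>
#include <vector>

// Nearest-neighbour downsampling of a single-channel map (a stand-in for the
// Downsample layer's forward pass; the real layer handles 2-channel flow).
std::vector<float> downsample_nearest(const std::vector<float>& src,
                                      int src_h, int src_w,
                                      int dst_h, int dst_w) {
  std::vector<float> dst(dst_h * dst_w);
  for (int y = 0; y < dst_h; ++y)
    for (int x = 0; x < dst_w; ++x)
      dst[y * dst_w + x] =
          src[(y * src_h / dst_h) * src_w + (x * src_w / dst_w)];
  return dst;
}

// Gradient of an L1 loss |pred - gt| with respect to the prediction only.
// gt is treated as a constant, so no gradient flows back through downsampling.
std::vector<float> l1_loss_grad(const std::vector<float>& pred,
                                const std::vector<float>& gt) {
  std::vector<float> grad(pred.size());
  for (size_t i = 0; i < pred.size(); ++i)
    grad[i] = (pred[i] > gt[i]) ? 1.0f : (pred[i] < gt[i] ? -1.0f : 0.0f);
  return grad;
}

int main() {
  // 4x4 "ground-truth flow" downsampled to the 2x2 "prediction" resolution.
  std::vector<float> gt_full = {1, 1, 2, 2,
                                1, 1, 2, 2,
                                3, 3, 4, 4,
                                3, 3, 4, 4};
  std::vector<float> pred = {1.5f, 1.0f,
                             3.0f, 5.0f};
  std::vector<float> gt_small = downsample_nearest(gt_full, 4, 4, 2, 2);
  std::vector<float> grad = l1_loss_grad(pred, gt_small);
  for (float g : grad) std::printf("%.1f ", g);  // gradient w.r.t. prediction
  std::printf("\n");
  return 0;
}

In the actual network the same principle applies: the loss layer writes its gradient into the predicted-flow bottom blob, while the downsampled ground-truth bottom never receives (or needs) a gradient.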

nikolausmayer commented 6 years ago

(closed due to inactivity)