lmb-freiburg / flownet2

FlowNet 2.0: Evolution of Optical Flow Estimation with Deep Networks
https://lmb.informatik.uni-freiburg.de/Publications/2017/IMKDB17/

Finetune Problem Caused by KITTI Sparse GT #153

Closed Elena-ssq closed 6 years ago

Elena-ssq commented 6 years ago

Hi,

I'm trying to finetune the network on the KITTI dataset, but the results are greatly affected by the sparse ground truth. I did notice that points with unavailable GT should not be considered when computing the loss, but I didn't find a parameter to mask them out.

Is there any suggestion on how to finetune on KITTI?

Thanks

nikolausmayer commented 6 years ago

Hi, the loss layer ignores NaN values (see https://github.com/lmb-freiburg/flownet2/blob/master/src/caffe/layers/l1loss_layer.cu#L22), but the invalid pixels in KITTI are zero, not NaN (the flow is stored as PNG, which I think cannot represent NaN). You should be fine if you take that into account when converting your data.
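For what it's worth, here is a minimal NumPy sketch of that conversion, assuming the KITTI devkit encoding (16-bit PNG with `u`, `v` scaled by 64 and offset by 2^15, and a validity flag in the third channel); the function name and the decoding step are my own, not part of this repo:

```python
import numpy as np

def kitti_png_to_flow(img):
    """Convert a decoded KITTI flow PNG (uint16 array of shape H x W x 3,
    channels [u, v, valid]) to float flow, marking invalid pixels as NaN
    so the L1 loss layer skips them.

    Assumes the KITTI devkit encoding: flow = (value - 2**15) / 64,
    third channel == 0 means "no ground truth here".
    """
    flow = (img[..., :2].astype(np.float32) - 2**15) / 64.0
    invalid = img[..., 2] == 0
    flow[invalid] = np.nan  # NaN pixels are ignored by l1loss_layer.cu
    return flow
```

You would decode the PNG with any 16-bit-capable reader (e.g. OpenCV with `cv2.IMREAD_UNCHANGED`) before passing it in, then write the result to whatever flow format your data layer expects.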

Elena-ssq commented 6 years ago

Hi,

The problem is solved by following your suggestion.

Thanks again!

Elena-ssq commented 6 years ago

Hi,

The results are really blurred after finetuning with all diffs without GT killed (set to zero). The parameter normalize_by_num_entries in the l1loss layer was set to true, and the others were left as false.
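For reference, a minimal NumPy sketch of the masked loss I mean (my own illustration, assuming diffs at pixels without GT are zeroed and the sum is divided by the number of valid entries, as with normalize_by_num_entries: true):

```python
import numpy as np

def masked_l1_loss(pred, gt):
    """Sketch of a masked L1 loss: pixels where gt is NaN contribute
    zero, and the sum is normalized by the count of valid pixels."""
    valid = ~np.isnan(gt)
    diff = np.where(valid, pred - gt, 0.0)  # kill diffs without GT
    return np.abs(diff).sum() / max(valid.sum(), 1)
```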

Any clue why this happens? Thanks a lot!

nikolausmayer commented 6 years ago
Elena-ssq commented 6 years ago

Please help.