lmb-freiburg / flownet2

FlowNet 2.0: Evolution of Optical Flow Estimation with Deep Networks
https://lmb.informatik.uni-freiburg.de/Publications/2017/IMKDB17/

Finetuning flownet2 on KITTI dataset #100

Closed lan1991 closed 6 years ago

lan1991 commented 6 years ago

Hi,

Does fine-tuning on KITTI mean fine-tuning only the parameters of the fusion network? The KITTI dataset provides only 394 flow ground-truth images in total; did you use additional training data when training FlowNet2-KITTI? Also, how did you use the sparse ground-truth data to train the network?

Thanks!

nikolausmayer commented 6 years ago

Hi, for "FlowNet2-ft-kitti" we finetuned the entire network. We did not use any other data, only our usual augmentation as described in the paper. You're right that there are very few samples; that's also why we cannot train on KITTI from scratch. Our loss and backpropagation just ignore invalid GT pixels.
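To illustrate the idea of ignoring invalid ground-truth pixels in the loss, here is a minimal NumPy sketch of a masked endpoint-error computation. The mask convention (a separate boolean validity map) is an assumption for illustration and not the repository's actual Caffe loss layer.

```python
import numpy as np

def masked_epe(pred_flow, gt_flow, valid_mask):
    """Average endpoint error over valid ground-truth pixels only.

    pred_flow, gt_flow: arrays of shape (H, W, 2) holding (u, v) flow vectors.
    valid_mask: boolean array of shape (H, W); False where KITTI GT is missing.
    (The mask convention here is assumed for illustration.)
    """
    # Per-pixel Euclidean distance between predicted and ground-truth flow.
    epe = np.linalg.norm(pred_flow - gt_flow, axis=-1)
    valid = valid_mask.astype(bool)
    if not valid.any():
        return 0.0
    # Invalid pixels contribute nothing to the loss, hence no gradient there.
    return float(epe[valid].mean())
```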

lan1991 commented 6 years ago

Hi, thanks for your reply! Could you provide the training network prototxt? The downloaded training prototxt template for the FlowNet2 model has blobs like "img0_b" and "img1_b" whose exact meaning I don't know. Is the fine-tuning strategy for FlowNet2-KITTI (e.g. lr, iters, ...) the same as in Solver_fine?

nikolausmayer commented 6 years ago

That prototxt file really is the one we used to train the networks. There are multiple input blobs because we were mixing datasets at some point. Eddy Ilg wanted to upload a more usable version; I'd humbly suggest you contact him directly :)
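As a rough illustration of why a training definition can end up with several input blobs, here is a hedged Python sketch of drawing one batch from two datasets. The function name, sampling scheme, and tuple layout are hypothetical; the real network mixes data inside its Caffe data layers, not in Python.

```python
import random

def mixed_batch(dataset_a, dataset_b, batch_size, p_a=0.5):
    """Draw a training batch whose samples come from two datasets.

    dataset_a, dataset_b: sequences of (img0, img1, flow_gt) tuples (assumed layout).
    p_a: probability that a given sample is drawn from dataset_a.
    Purely illustrative of dataset mixing; not the actual prototxt mechanism.
    """
    batch = []
    for _ in range(batch_size):
        source = dataset_a if random.random() < p_a else dataset_b
        batch.append(random.choice(source))
    return batch
```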

lan1991 commented 6 years ago

Thanks! @nikolausmayer