ClementPinard / FlowNetTorch

Torch implementation of Fischer et al. FlowNet training code
30 stars 6 forks

Adapting the code for DispNet training #12

Closed ptriantd closed 7 years ago

ptriantd commented 7 years ago

Trying to adapt the code for DispNet, I ran into the following problem: outputs = model:forward(inputs) gives nan values after a number of iterations (not always the same). Was this an issue that appeared in the case of the FlowNet? If yes, do you know what causes it?

ClementPinard commented 7 years ago

It's most likely an instability problem due to a learning rate that is too high.

You might want to try either lowering the learning rate or using batchNorm modules.

Also check that your data augmentation routines are not corrupting the label. For example, if you apply a lateral translation to one of the two inputs, you need to add the corresponding x offset to the disparity ground truth. Make sure translations are applied consistently, especially to the label, which may need one as well, depending on which input you translated.
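The consistency requirement above can be sketched on a toy 1-D stereo pair (plain Python rather than Torch, with a hypothetical `shift` helper): since disparity is d = x_left − x_right, shifting only the right image one pixel further left moves every matched point one column left, so the ground-truth disparity must grow by the same amount.

```python
def shift(row, t, fill=0.0):
    """Translate a 1-D row of pixels t positions to the right
    (negative t shifts left), filling the uncovered border.
    Illustrative stand-in for an image translation routine."""
    n = len(row)
    out = [fill] * n
    for x, v in enumerate(row):
        if 0 <= x + t < n:
            out[x + t] = v
    return out

# Toy stereo pair: disparity d = x_left - x_right.
left = [0.0] * 8
left[5] = 1.0               # feature at column 5 in the left image
right = shift(left, -2)     # same feature at column 3 -> true disparity 2
disp_gt = [2.0] * 8

# Augmentation: translate ONLY the right image one more pixel left...
t = 1
right_aug = shift(right, -t)          # feature now at column 2
# ...so the apparent disparity grows by t, and the label must follow:
disp_aug = [d + t for d in disp_gt]   # forgetting this line corrupts training
```

If the label update is forgotten, the network is trained toward a disparity that is off by the translation amount, which matches the "bad results or NaN" symptom described below.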

I had the same problem with flow, and got either bad results or NaN outputs.

Good luck!