ClementPinard / SfmLearner-Pytorch

Pytorch version of SfmLearner from Tinghui Zhou et al.
MIT License

Questions about Smooth Loss Weight and batch size #105

Closed alexlopezcifuentes closed 3 years ago

alexlopezcifuentes commented 4 years ago

Hi Clement!

Thanks a lot for the work coding this paper in PyTorch. I have two questions:

Thanks in advance!

ClementPinard commented 4 years ago

Hi, thanks for your interest in my repo

  1. The smooth loss is computed with the scale taken into account. The term we add is the sum of the scale-aware smooth losses computed in this function: https://github.com/ClementPinard/SfmLearner-Pytorch/blob/master/loss_functions.py#L70 (note the `weight /= 2.3` each time we go down a scale).
  2. Empirical tests showed that this batch size produced the best results. For my tests (2018), it was large enough that the CPU was not the bottleneck and the GPU ran at 100% compute utilization (not memory). For more recent work, there have been some tests with multi-GPU training; you can have a look at e.g. competitive collaboration: https://github.com/anuragranj/cc
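
For point 1, here is a minimal sketch of what a scale-aware smooth loss with the `weight /= 2.3` decay looks like; input shapes and the exact gradient terms are assumptions based on the linked function, so check `loss_functions.py` for the authoritative version:

```python
import torch

def smooth_loss(pred_disps):
    # pred_disps: list of disparity tensors of shape (B, 1, H, W),
    # ordered from finest to coarsest scale (assumed ordering).
    loss = 0.0
    weight = 1.0
    for disp in pred_disps:
        # first-order spatial gradients
        dx = disp[:, :, :, 1:] - disp[:, :, :, :-1]
        dy = disp[:, :, 1:, :] - disp[:, :, :-1, :]
        # second-order gradients penalize non-smooth disparity
        dx2 = dx[:, :, :, 1:] - dx[:, :, :, :-1]
        dxdy = dx[:, :, 1:, :] - dx[:, :, :-1, :]
        dydx = dy[:, :, :, 1:] - dy[:, :, :, :-1]
        dy2 = dy[:, :, 1:, :] - dy[:, :, :-1, :]
        loss += (dx2.abs().mean() + dxdy.abs().mean()
                 + dydx.abs().mean() + dy2.abs().mean()) * weight
        # coarser scales contribute less to the total
        weight /= 2.3
    return loss
```

Because the disparity at a coarse scale is already implicitly smoother (one pixel covers a larger image area), down-weighting each coarser scale keeps the per-scale contributions roughly balanced.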
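
For point 2, a hedged sketch of the standard PyTorch way to spread a larger batch across several GPUs (the toy model here is hypothetical, standing in for the repo's networks):

```python
import torch
from torch import nn

# Hypothetical placeholder model; in the repo this would be
# DispNetS / PoseExpNet.
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())

if torch.cuda.device_count() > 1:
    # nn.DataParallel replicates the model on each GPU and splits
    # the batch along dimension 0, so a larger batch size scales
    # across devices.
    model = nn.DataParallel(model)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
```

Note that `nn.DataParallel` is the simplest option; multi-process `DistributedDataParallel` is what PyTorch recommends nowadays for better scaling.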