alievk opened this issue 3 years ago

Hi Aliaksandr,

What's the point of using SynchronizedBatchNorm2d instead of vanilla BatchNorm2d?
In my experience it usually works better, so I start with sync BN. I did not test it with the latest model, though, so I'm not sure whether it is better in this case.
Do you mean it generally works better for training deep learning models, or specifically for FOMM?
In general.
I only see a difference in the equivariance_jacobian loss (pink with SyncBatchNorm, green without).
However, SyncBatchNorm is slower and more memory-consuming.
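For context, the way I understand the "point": under multi-GPU DataParallel, plain BatchNorm2d normalizes each GPU's slice of the batch separately, while SynchronizedBatchNorm2d aggregates mean/variance across GPUs, which matters when the per-GPU batch is small. Below is a minimal sketch of the equivalent built-in mechanism in recent PyTorch; this is not the repo's code (the bundled SynchronizedBatchNorm2d is written for DataParallel), and the built-in SyncBatchNorm only synchronizes under DistributedDataParallel:

```python
# Sketch: converting plain BatchNorm layers to PyTorch's built-in SyncBatchNorm,
# which computes batch statistics over all GPUs instead of each GPU's slice.
import torch.nn as nn

model = nn.Sequential(                          # stand-in for the generator / kp_detector
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.BatchNorm2d(64),
    nn.ReLU(inplace=True),
)

# Replace every BatchNorm layer with SyncBatchNorm in place of the bundled module.
model = nn.SyncBatchNorm.convert_sync_batchnorm(model)
print(model)

# In a DDP training script one would then wrap it, e.g.:
# model = nn.parallel.DistributedDataParallel(model.cuda(), device_ids=[local_rank])
```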
Did you explore the effect of the learning rate on the KP detector? Is the value defined in the config optimal?
Not really.
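For anyone who wants to experiment with it: if I remember the training code correctly, the KP detector gets its own Adam optimizer driven by `train_params['lr_kp_detector']` from the config, so sweeping the value is just a config change. A rough, self-contained sketch (the module is a placeholder and the value is illustrative, not the released default):

```python
# Sketch of where the KP-detector learning rate enters training.
import torch
import torch.nn as nn

kp_detector = nn.Conv2d(3, 10, kernel_size=7, padding=3)   # placeholder for the real KPDetector
train_params = {'lr_kp_detector': 2.0e-4}                   # edit this in the training config to sweep it

# Separate optimizer so the KP detector's learning rate can be tuned
# independently of the generator's.
optimizer_kp_detector = torch.optim.Adam(
    kp_detector.parameters(),
    lr=train_params['lr_kp_detector'],
    betas=(0.5, 0.999),                                     # betas as in the training code, IIRC
)
```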