Closed wtpro closed 10 months ago
Hi, what is your purpose in setting the learning rate to 0?
Hi,
It was just a way to isolate a problem.
I was trying to train part of the encoder network at test time and see what the predicted depths would look like, and they were quite bad. I suspected this could be caused by other parts of the model that I want frozen at test time but that are not actually frozen.
So I tried setting the learning rate to 0, and the loss was still changing. Any idea what causes this?
Ok, I see your point. However, even with the learning rate set to 0, some layers (e.g. batchnorm layers) still update their internal state: batchnorm's running mean and variance are buffers updated during every forward pass in training mode, not parameters updated by the optimizer, so a zero learning rate does not stop them. That is why the loss changes.
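A minimal PyTorch sketch of this effect (assuming the project uses PyTorch; the layer here is a stand-in, not the repo's actual network): with lr=0 the optimizer leaves the learnable weight/bias untouched, but a `BatchNorm` layer in train mode still moves its running statistics on every forward pass. Calling `.eval()` is what actually freezes them.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
bn = nn.BatchNorm1d(4)
opt = torch.optim.SGD(bn.parameters(), lr=0.0)  # learning rate 0

before = bn.running_mean.clone()
x = torch.randn(8, 4) + 5.0  # batch mean far from the initial running mean (0)
opt.zero_grad()
bn(x).sum().backward()
opt.step()  # lr=0: weight and bias stay at their initial values ...

assert torch.allclose(bn.weight, torch.ones(4))          # ... parameters unchanged,
assert not torch.allclose(bn.running_mean, before)       # ... but running stats moved

# To truly freeze batchnorm at test time, switch it to eval mode:
bn.eval()
frozen = bn.running_mean.clone()
bn(torch.randn(8, 4))
assert torch.allclose(bn.running_mean, frozen)           # stats no longer update
```

So if you want the encoder partially trainable while everything else stays fixed, the frozen modules need `.eval()` (and `requires_grad_(False)`), not just a zero learning rate.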
And the data is shuffled, so the loss is computed on different batches each step, which contributes too! Thanks for your reply.
Hi,
Thanks for such amazing work!
Just out of curiosity, I set the learning rate for both the pose network and the depth network to 0 for the whole run. I noticed that the loss still decreases during training.
This is very confusing to me. I have made no change to your code in this case. Any insight on this situation?