noahzn / Lite-Mono

[CVPR2023] Lite-Mono: A Lightweight CNN and Transformer Architecture for Self-Supervised Monocular Depth Estimation
MIT License

learning rate issue #99

Closed wtpro closed 10 months ago

wtpro commented 10 months ago

Hi,

Thanks for such amazing work!

Just out of curiosity, I set the learning rate for both the pose network and the depth network to 0 for the entire training run. I noticed that the loss still decreases during training.

[screenshot: training loss curve, still decreasing]

This is very confusing to me. I have made no change to your code in this case. Any insight on this situation?

noahzn commented 10 months ago

Hi, what is your purpose in setting the learning rate to 0?

wtpro commented 10 months ago

Hi,

It was just a way to isolate a problem.

I was trying to train part of the encoder network at test time and see what the predicted depths would look like, and they were quite bad. I suspected this could be attributed to other parts of the network that I wanted frozen at test time but that were not.

So I tried a learning rate of 0, and the loss was still changing. Any idea what causes this?

noahzn commented 10 months ago

OK, I see your point. However, even if the learning rate is set to 0, some layers still update internal state in training mode. For example, batchnorm layers keep updating their running mean and variance on every forward pass, independently of the optimizer. So the loss changes.
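This is easy to verify in isolation. The sketch below (a minimal PyTorch example, not code from the Lite-Mono repository) trains a tiny conv + batchnorm stack with `lr=0`: the optimizer leaves the learnable weights untouched, yet the batchnorm running statistics drift, so the same input yields different losses over time.

```python
# Minimal sketch: with lr=0 the optimizer step is a no-op, but a
# BatchNorm layer in train() mode still updates its running statistics.
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.BatchNorm2d(8))
opt = torch.optim.Adam(net.parameters(), lr=0.0)

w_before = net[0].weight.clone()            # conv weights before "training"
mean_before = net[1].running_mean.clone()   # batchnorm running mean before

net.train()
for _ in range(5):
    x = torch.randn(4, 3, 16, 16)
    loss = net(x).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()  # lr=0, so this changes nothing

print(torch.equal(net[0].weight, w_before))              # True: weights frozen
print(torch.allclose(net[1].running_mean, mean_before))  # False: stats moved
```

Calling `net.eval()` (or constructing the layers with `track_running_stats=False`) stops the statistics from updating, which is the usual way to truly freeze such layers.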

wtpro commented 10 months ago

And the data is shuffled, which contributes to the loss variation too! Thanks for your reply.