2gunsu / monocon-pytorch

Unofficial Pytorch Implementation for MonoCon(AAAI, 2022)
Apache License 2.0

loss increasing #24

Open FlyingAnt2018 opened 1 year ago

FlyingAnt2018 commented 1 year ago

Hi, bro. I just changed the number of training epochs from 200 to 300, and when training reaches around epoch 100, all loss terms increase significantly. Do you know the reason? Thanks.

[image]

SrinjaySarkar commented 1 year ago

Yeah, I have observed the exact same thing. Could you find any reason, @FlyingAnt2018?

FlyingAnt2018 commented 1 year ago

> Yeah, I have observed the exact same thing. Could you find any reason, @FlyingAnt2018?

The learning rate is too large.
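If the training schedule follows a one-cycle / cyclic learning-rate policy, changing the total epoch count also moves where the peak learning rate lands, which could explain a spike around epoch 100. As a quick diagnostic, you can dry-run a scheduler for the new epoch count and print the learning rate over time. This is a minimal sketch with assumed values (base lr, peak multiplier, steps per epoch), not this repository's actual training code:

```python
import torch

# Hypothetical values; replace them with the ones from your own config.
base_lr = 2.25e-4
epochs = 300
steps_per_epoch = 100  # len(train_loader) in a real run

model = torch.nn.Linear(4, 4)  # dummy parameters, only needed to build an optimizer
optimizer = torch.optim.AdamW(model.parameters(), lr=base_lr)

# Assumed one-cycle policy: the peak position depends on total_steps,
# so it shifts when `epochs` changes from 200 to 300.
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=base_lr * 10,                    # assumed peak; check your config
    total_steps=epochs * steps_per_epoch,
)

for epoch in range(epochs):
    for _ in range(steps_per_epoch):
        optimizer.step()   # no-op here; keeps the scheduler's step-order check quiet
        scheduler.step()
    if epoch % 25 == 0:
        print(f"epoch {epoch:3d}  lr = {optimizer.param_groups[0]['lr']:.2e}")
```

If the printed learning rate peaks or jumps near the epoch where the losses blow up, lowering `max_lr` (or the base lr) is the first thing to try.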

SrinjaySarkar commented 1 year ago

Could you provide the lr you used to train it on 300 epochs? @FlyingAnt2018

2gunsu commented 11 months ago

Hello. I'm sorry for the delay in answering. 😢 I'll check what you said and get back to you soon.

Thank you.

sunhucheng commented 11 months ago

> Hello. I'm sorry for the delay in answering. 😢 I'll check what you said and get back to you soon.
>
> Thank you.

I have observed the same problem. Have you solved it? I think it's a matter of learning rate. @2gunsu

sunhucheng commented 11 months ago

I've found that this doesn't happen when the batch size is 8, but it does happen when it's 24. If I resume training from a checkpoint saved before the loss starts increasing, the loss decreases again.
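For anyone who wants to try the resume-from-checkpoint workaround, a generic PyTorch sketch is below. The checkpoint path and key names are assumptions, not this repository's save format; adapt them to however your training script stores its state. It also optionally shrinks the learning rate before continuing, since a too-large lr is the suspected cause:

```python
import torch
from torch import nn, optim


def resume_before_spike(model: nn.Module,
                        optimizer: optim.Optimizer,
                        ckpt_path: str = "checkpoints/epoch_090.pth",
                        lr_scale: float = 0.1) -> int:
    """Load a checkpoint saved before the loss spike and lower the learning rate.

    The path and key names here are hypothetical; match them to the checkpoint
    dictionary your training script actually writes. Returns the epoch to resume from.
    """
    ckpt = torch.load(ckpt_path, map_location="cpu")
    model.load_state_dict(ckpt["model"])
    optimizer.load_state_dict(ckpt["optimizer"])

    # Shrink the learning rate in every parameter group before resuming.
    for group in optimizer.param_groups:
        group["lr"] *= lr_scale

    return ckpt.get("epoch", 0) + 1
```

Build the model and optimizer exactly as in the original run, call this once before the training loop, and start the loop from the returned epoch.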

QSQSQSQS commented 3 months ago

I have observed the exact same thing while training on nuScenes, but it didn't happen with KITTI.