ramdrop / stun

Implementation for the paper: STUN: Self-Teaching Uncertainty Estimation for Place Recognition
BSD 3-Clause "New" or "Revised" License

A problem about python main.py --phase=train_tea --loss=cont #7

Closed · shenyehui closed this issue 12 months ago

shenyehui commented 12 months ago

Dear authors, I got this error report while training with your code via `python main.py --phase=train_tea --loss=cont`. Did you get the same error while training? How did you solve it?

```
/root/miniconda3/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:129: UserWarning: Detected call of lr_scheduler.step() before optimizer.step(). In PyTorch 1.1.0 and later, you should call them in the opposite order: optimizer.step() before lr_scheduler.step(). Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
```
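For reference, the call order the warning asks for would look like this (a generic sketch of the pattern from the PyTorch docs, not the STUN training loop; the model, optimizer, and batch here are dummies):

```python
# Generic sketch of the call order PyTorch expects: optimizer.step() inside the
# batch loop, lr_scheduler.step() afterwards. Dummy model/data for illustration.
import torch
import torch.nn.functional as F

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.99)

for epoch in range(3):
    for x, y in [(torch.randn(4, 10), torch.randn(4, 2))]:  # dummy batch
        optimizer.zero_grad()
        loss = F.mse_loss(model(x), y)
        loss.backward()
        optimizer.step()   # optimizer.step() first ...
    scheduler.step()       # ... then lr_scheduler.step(), once per epoch
```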

ramdrop commented 12 months ago

What is your PyTorch version? And could you provide a full log?

shenyehui commented 12 months ago

I am using a cloud server with the following configuration:

- PyTorch 1.10.0
- Python 3.8 (ubuntu20.04)
- CUDA 11.3
- GPU: V100-SXM2-32GB (32GB)

Screen log:

```
e= 0,lr=0.00001000,tl=0.0000,r@1/5/10=29.11/50.29/58.86
e= 1,lr=0.00000990,tl=0.0000,r@1/5/10=29.11/50.29/58.86
e= 2,lr=0.00000980,tl=0.0000,r@1/5/10=29.11/50.29/58.86
e= 3,lr=0.00000970,tl=0.0000,r@1/5/10=29.11/50.29/58.86
e= 4,lr=0.00000961,tl=0.0000,r@1/5/10=29.11/50.29/58.86
e= 5,lr=0.00000951,tl=0.0000,r@1/5/10=29.11/50.29/58.86
```

shenyehui commented 12 months ago

```
train_tea --> ./logs/cont_train_tea_0718_175249
100%|████████████████████| 137/137 [00:22<00:00,  5.98it/s]
100%|████████████████████| 915/915 [00:27<00:00, 32.84it/s]
/root/miniconda3/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:129: UserWarning: Detected call of lr_scheduler.step() before optimizer.step(). In PyTorch 1.1.0 and later, you should call them in the opposite order: optimizer.step() before lr_scheduler.step(). Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  warnings.warn("Detected call of lr_scheduler.step() before optimizer.step(). "
100%|████████████████████| 138/138 [00:23<00:00,  5.95it/s]
e= 0,lr=0.00001000,tl=0.0000,r@1/5/10=29.11/50.29/58.86 *
```

ramdrop commented 12 months ago

It results from an inappropriate margin parameter: no pairs violate the margin, so the loss is zero (note tl=0.0000 in your log) and the training step is skipped, which in turn triggers the lr_scheduler warning and leaves the recall frozen across epochs.
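To illustrate (a simplified sketch with hypothetical anchor-negative distances, using the common hinge form max(0, margin - d)^2 for negative pairs; not our actual loss code):

```python
# Simplified illustration: when the margin is too small, no negative pair
# violates it, the hinge loss is exactly zero, there is no gradient, and the
# training step is skipped -- hence tl=0.0000 and the lr_scheduler warning.
import torch
import torch.nn.functional as F

d = torch.tensor([0.2, 0.5, 0.9])  # hypothetical anchor-negative distances

for margin in (0.1, 0.4):          # too-small margin vs. the suggested one
    loss = F.relu(margin - d).pow(2).mean()
    print(f"margin={margin}: loss={loss.item():.4f}")
# margin=0.1: loss=0.0000 -> step skipped
# margin=0.4: loss=0.0133 -> gradients flow and training proceeds
```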

To train with the contrastive setting, you may want to run:

```
python main.py --phase=train_tea --loss=cont --margin=0.4
```

Note that since our Contrastive and Quadruplet code has not been fully cleaned and archived, we suggest you try the Triplet setting instead.