SwinTransformer / Transformer-SSL

This is an official implementation for "Self-Supervised Learning with Swin Transformers".
https://arxiv.org/abs/2105.04553
MIT License

Strange output log #2

Open launchauto opened 3 years ago

launchauto commented 3 years ago

Hi authors, I have pretrained your moby_swin_tiny model on 8 Tesla V100 GPUs and reproduced your results on the downstream tasks: I get 74.394% on linear evaluation, 43.1% on COCO object detection, and 39.3% on COCO segmentation. But the loss and grad_norm are really weird during training. Can you show me your log? Here is mine: the loss drops to 7, then rises to 16, and never drops again. During pretraining, the average grad_norm sometimes rises to infinity. log_rank0.txt
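For context on the "grad norm rises to infinite" symptom: an infinite norm usually means at least one gradient overflowed (typical of fp16 training). As a hedged illustration only, here is a pure-Python stand-in for how a total gradient norm is computed and clipped (mirroring the semantics of `torch.nn.utils.clip_grad_norm_`; this is not the repo's actual code, and the helper names are hypothetical):

```python
import math

def total_grad_norm(grads):
    """L2 norm over all gradient values; becomes inf if any gradient overflowed."""
    return math.sqrt(sum(g * g for g in grads))

def clip_grads(grads, max_norm):
    """Scale gradients down so their total norm is at most max_norm.
    Returns (clipped_grads_or_None, norm)."""
    norm = total_grad_norm(grads)
    if math.isinf(norm) or math.isnan(norm):
        # Overflow: a mixed-precision trainer would skip this optimizer step.
        return None, norm
    if norm > max_norm:
        scale = max_norm / (norm + 1e-6)
        grads = [g * scale for g in grads]
    return grads, norm

# A single overflowed gradient makes the whole norm infinite, which is
# exactly what shows up as "inf" in a pretraining log.
clipped, norm = clip_grads([3.0, 4.0], max_norm=1.0)
overflowed, bad_norm = clip_grads([float("inf"), 4.0], max_norm=1.0)
```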

launchauto commented 3 years ago

The uploaded log_rank0.txt is one of the eight GPU pretraining logs, and the uploaded log_rank7.txt is one of the eight GPU linear-evaluation logs. log_rank7.txt

michuanhaohao commented 3 years ago

I also encountered the same problem.

tbup commented 2 years ago

@launchauto @michuanhaohao Me too, but I ran it with precision O0. Did you run with O0 precision? log_rank0.txt
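For readers unfamiliar with the apex opt levels: O0 is pure fp32 (mixed precision effectively off), while O1 and above run fp16 with dynamic loss scaling. As a hedged sketch only (pure Python, hypothetical names, not apex's actual implementation), the core of dynamic loss scaling looks roughly like this:

```python
import math

class DynamicLossScaler:
    """Minimal sketch of dynamic loss scaling in fp16 training:
    the loss is multiplied by `scale` before backward; if the resulting
    gradients overflow, the optimizer step is skipped and the scale is
    halved; after a stretch of stable steps the scale is grown again."""

    def __init__(self, init_scale=2.0 ** 16, growth_interval=2000):
        self.scale = init_scale
        self.growth_interval = growth_interval
        self._good_steps = 0

    def step(self, grads):
        """Return True if the optimizer step should be applied."""
        if any(math.isinf(g) or math.isnan(g) for g in grads):
            self.scale /= 2.0          # overflow: back off
            self._good_steps = 0
            return False
        self._good_steps += 1
        if self._good_steps % self.growth_interval == 0:
            self.scale *= 2.0          # stable for a while: grow again
        return True

scaler = DynamicLossScaler(init_scale=4.0)
ok = scaler.step([0.1, 0.2])            # normal gradients: step applied
bad = scaler.step([float("inf"), 0.2])  # overflow: step skipped, scale halved
```

Frequent skipped steps from repeated overflows are one plausible (unconfirmed) way a mixed-precision run could stall at a high loss while an fp32 (O0) run trains normally.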

Rocky1salady-killer commented 2 years ago

I ran into this problem too! The loss stays at 16 and never drops.

Rocky1salady-killer commented 2 years ago

How can I avoid using apex mixed precision? When I train with Swin Transformer (supervised), the loss drops and converges. However, I noticed that the Swin Transformer project does not use apex mixed precision.
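One way to rule mixed precision in or out: the upstream Swin-Transformer training scripts expose an AMP opt-level option. Assuming Transformer-SSL inherits the same `--amp-opt-level` flag (worth verifying against the repo's `moby_main.py` and config files; the paths below are placeholders), pretraining could be forced to pure fp32 like this:

```shell
# Hypothetical invocation; the flag name is assumed from the upstream
# Swin-Transformer repo and the config path is a placeholder.
# O0 = pure fp32 (mixed precision disabled), O1 = default mixed precision.
python -m torch.distributed.launch --nproc_per_node 8 moby_main.py \
    --cfg <your_moby_swin_tiny_config>.yaml \
    --amp-opt-level O0
```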

Chengyang852 commented 1 year ago

Is it normal for the loss value to be around 16? Has anyone encountered this problem?

NonTerraePlusUltra commented 1 year ago

> How can I avoid using apex mixed precision? When I train with Swin Transformer, the loss drops and converges. However, I noticed that the Swin Transformer project does not use apex mixed precision.

Has your problem been solved?

NonTerraePlusUltra commented 1 year ago

> Is it normal for the loss value to be around 16? Has anyone encountered this problem?

Me too.

Pang-b0 commented 1 year ago

Excuse me, have you solved the problem where the loss drops to 8.9 and then rises again? Is it caused by apex mixed-precision training?

NonTerraePlusUltra commented 1 year ago

> Excuse me, have you solved the problem where the loss drops to 8.9 and then rises again? Is it caused by apex mixed-precision training?

No /(ㄒoㄒ)/~~

Pang-b0 commented 1 year ago

Could it be a problem with the loss function? Are you still following this code? My loss has been 16 from the start and never drops.

NonTerraePlusUltra commented 1 year ago

> Could it be a problem with the loss function? Are you still following this code? My loss has been 16 from the start and never drops.

I haven't solved it either...