AILab-CVC / YOLO-World

[CVPR 2024] Real-Time Open-Vocabulary Object Detection
https://www.yoloworld.cc
GNU General Public License v3.0

Finetuning on COCO: the loss dropped from about 350 to about 250, but bbox_mAP_copypaste is 0.017 0.068 0.001 0.001 0.003 0.041. What could be the cause? #134

Open grainw opened 5 months ago

grainw commented 5 months ago

I used the yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco.py config with the pretrained weights yolo_world_l_clip_base_dual_vlpan_2e-3adamw_32xb16_100e_o365_goldg_cc3mlite_train_pretrained-7a5eea3b.pth, running on 6 A800 GPUs with a batch size of 16; everything else was left unchanged.
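
(For reference, a minimal sketch of how such a finetune run is usually wired up in an mmengine-style config. The field names follow standard mmengine conventions and the paths are just the ones quoted above; this is not an excerpt of the actual YOLO-World config.)

```python
# finetune_override.py -- illustrative mmengine-style override only
_base_ = 'yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco.py'

# start from the open-vocabulary pretraining checkpoint
load_from = ('pretrained_models/yolo_world_l_clip_base_dual_vlpan_2e-3adamw_'
             '32xb16_100e_o365_goldg_cc3mlite_train_pretrained-7a5eea3b.pth')

# per-GPU batch size; with 6 GPUs the total batch is 6 x 16 = 96,
# while the base config name assumes 8 GPUs (total batch 128)
train_dataloader = dict(batch_size=16)
```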

wondervictor commented 5 months ago

Hi @grainw, did you make any other modifications?

grainw commented 5 months ago

No other modifications; the only difference is that I used 6 GPUs.
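
(Not necessarily the cause here, but worth noting: with 6 GPUs at batch 16 the total batch is 96 instead of the 128 the "8gpus" config assumes. A tiny, purely illustrative sketch of the common linear learning-rate scaling rule of thumb:)

```python
# Linear LR scaling rule of thumb (illustrative only, not taken from the repo)
base_lr = 2e-4               # LR the 8-GPU config is tuned for
base_total_batch = 8 * 16    # 128 images per optimizer step
actual_total_batch = 6 * 16  # 96 images per optimizer step

scaled_lr = base_lr * actual_total_batch / base_total_batch
print(f"{scaled_lr:.1e}")    # 1.5e-04
```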

grainw commented 5 months ago

Loads checkpoint by local backend from path: pretrained_models/yolo_world_l_clip_base_dual_vlpan_2e-3adamw_32xb16_100e_o365_goldg_cc3mlite_train_pretrained-7a5eea3b.pth

The model and loaded state dict do not match exactly

unexpected key in source state_dict: neck.text_enhancer.projections.0.conv.weight, neck.text_enhancer.projections.0.conv.bias, neck.text_enhancer.projections.1.conv.weight, neck.text_enhancer.projections.1.conv.bias, neck.text_enhancer.projections.2.conv.weight, neck.text_enhancer.projections.2.conv.bias, neck.text_enhancer.query.0.weight, neck.text_enhancer.query.0.bias, neck.text_enhancer.query.1.weight, neck.text_enhancer.query.1.bias, neck.text_enhancer.key.0.weight, neck.text_enhancer.key.0.bias, neck.text_enhancer.key.1.weight, neck.text_enhancer.key.1.bias, neck.text_enhancer.value.0.weight, neck.text_enhancer.value.0.bias, neck.text_enhancer.value.1.weight, neck.text_enhancer.value.1.bias, neck.text_enhancer.proj.weight, neck.text_enhancer.proj.bias, neck.top_down_layers.0.attn_block.bias, neck.top_down_layers.0.attn_block.guide_fc.weight, neck.top_down_layers.0.attn_block.guide_fc.bias, neck.top_down_layers.1.attn_block.bias, neck.top_down_layers.1.attn_block.guide_fc.weight, neck.top_down_layers.1.attn_block.guide_fc.bias, neck.bottom_up_layers.0.attn_block.bias, neck.bottom_up_layers.0.attn_block.guide_fc.weight, neck.bottom_up_layers.0.attn_block.guide_fc.bias, neck.bottom_up_layers.1.attn_block.bias, neck.bottom_up_layers.1.attn_block.guide_fc.weight, neck.bottom_up_layers.1.attn_block.guide_fc.bias

python3.8/site-packages/torch/functional.py:568: UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at /opt/conda/conda-bld/pytorch_1646755903507/work/aten/src/ATen/native/TensorShape.cpp:2228.)
  return _VF.meshgrid(tensors, **kwargs)  # type: ignore[attr-defined]
03/15 11:45:29 - mmengine - INFO - Epoch(train) [1][ 50/1233] base_lr: 2.0000e-04 lr: 2.6494e-06 eta: 1 day, 3:58:54 time: 1.0218 data_time: 0.0635 memory: 40857 grad_norm: nan loss: 380.8421 loss_cls: 153.0965 loss_bbox: 109.3013 loss_dfl: 118.4443
03/15 11:45:54 - mmengine - INFO - Epoch(train) [1][ 100/1233] base_lr: 2.0000e-04 lr: 5.3528e-06 eta: 20:38:25 time: 0.4864 data_time: 0.0051 memory: 11390 grad_norm: inf loss: 365.0228 loss_cls: 141.9976 loss_bbox: 106.1955 loss_dfl: 116.8297
03/15 11:46:18 - mmengine - INFO - Epoch(train) [1][ 150/1233] base_lr: 2.0000e-04 lr: 8.0562e-06 eta: 18:10:17 time: 0.4845 data_time: 0.0049 memory: 11417 grad_norm: 554.3548 loss: 348.4887 loss_cls: 131.1138 loss_bbox: 101.9342 loss_dfl: 115.4406
03/15 11:46:42 - mmengine - INFO - Epoch(train) [1][ 200/1233] base_lr: 2.0000e-04 lr: 1.0760e-05 eta: 16:57:10 time: 0.4873 data_time: 0.0048 memory: 11443 grad_norm: 545.1316 loss: 348.9414 loss_cls: 133.1653 loss_bbox: 100.6599 loss_dfl: 115.1162
03/15 11:47:07 - mmengine - INFO - Epoch(train) [1][ 250/1233] base_lr: 2.0000e-04 lr: 1.3463e-05 eta: 16:14:22 time: 0.4910 data_time: 0.0048 memory: 11430 grad_norm: 529.0525 loss: 336.5965 loss_cls: 126.8139 loss_bbox: 97.6826 loss_dfl: 112.1000
03/15 11:47:31 - mmengine - INFO - Epoch(train) [1][ 300/1233] base_lr: 2.0000e-04 lr: 1.6167e-05 eta: 15:43:39 time: 0.4836 data_time: 0.0045 memory: 11630 grad_norm: 517.6128 loss: 329.5110 loss_cls: 123.5677 loss_bbox: 95.6097 loss_dfl: 110.3336
03/15 11:47:55 - mmengine - INFO - Epoch(train) [1][ 350/1233] base_lr: 2.0000e-04 lr: 1.8870e-05 eta: 15:20:44 time: 0.4799 data_time: 0.0047 memory: 12030 grad_norm: 520.9134 loss: 329.5116 loss_cls: 122.7626 loss_bbox: 96.1785 loss_dfl: 110.5705
03/15 11:48:19 - mmengine - INFO - Epoch(train) [1][ 400/1233] base_lr: 2.0000e-04 lr: 2.1573e-05 eta: 15:03:36 time: 0.4806 data_time: 0.0048 memory: 11243 grad_norm: 525.1572 loss: 326.7000 loss_cls: 121.7100 loss_bbox: 94.7074 loss_dfl: 110.2826

These are some of the logs from the start of training.

grainw commented 5 months ago

03/15 13:29:13 - mmengine - INFO - Epoch(train) [10][1200/1233] base_lr: 2.0000e-04 lr: 1.8020e-04 eta: 11:56:48 time: 0.5094 data_time: 0.0018 memory: 11248 grad_norm: 558.2875 loss: 309.5877 loss_cls: 108.0707 loss_bbox: 94.4306 loss_dfl: 107.0865
03/15 13:29:29 - mmengine - INFO - Exp name: yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco_20240315_114345
03/15 13:29:30 - mmengine - INFO - Saving checkpoint at 10 epochs
03/15 13:29:40 - mmengine - INFO - Epoch(val) [10][ 50/834] eta: 0:00:18 time: 0.0240 data_time: 0.0008 memory: 11422
03/15 13:29:41 - mmengine - INFO - Epoch(val) [10][100/834] eta: 0:00:17 time: 0.0233 data_time: 0.0003 memory: 1715
03/15 13:29:43 - mmengine - INFO - Epoch(val) [10][150/834] eta: 0:00:16 time: 0.0232 data_time: 0.0003 memory: 1715
03/15 13:29:44 - mmengine - INFO - Epoch(val) [10][200/834] eta: 0:00:14 time: 0.0232 data_time: 0.0003 memory: 1715
03/15 13:29:45 - mmengine - INFO - Epoch(val) [10][250/834] eta: 0:00:13 time: 0.0239 data_time: 0.0003 memory: 1715
03/15 13:29:46 - mmengine - INFO - Epoch(val) [10][300/834] eta: 0:00:12 time: 0.0232 data_time: 0.0003 memory: 1715
03/15 13:29:47 - mmengine - INFO - Epoch(val) [10][350/834] eta: 0:00:11 time: 0.0230 data_time: 0.0003 memory: 1715
03/15 13:29:48 - mmengine - INFO - Epoch(val) [10][400/834] eta: 0:00:10 time: 0.0234 data_time: 0.0003 memory: 1715
03/15 13:29:50 - mmengine - INFO - Epoch(val) [10][450/834] eta: 0:00:08 time: 0.0231 data_time: 0.0003 memory: 1715
03/15 13:29:51 - mmengine - INFO - Epoch(val) [10][500/834] eta: 0:00:07 time: 0.0232 data_time: 0.0003 memory: 1715
03/15 13:29:52 - mmengine - INFO - Epoch(val) [10][550/834] eta: 0:00:06 time: 0.0231 data_time: 0.0003 memory: 1715
03/15 13:29:53 - mmengine - INFO - Epoch(val) [10][600/834] eta: 0:00:05 time: 0.0232 data_time: 0.0003 memory: 1715
03/15 13:29:54 - mmengine - INFO - Epoch(val) [10][650/834] eta: 0:00:04 time: 0.0231 data_time: 0.0003 memory: 1715
03/15 13:29:55 - mmengine - INFO - Epoch(val) [10][700/834] eta: 0:00:03 time: 0.0226 data_time: 0.0003 memory: 1715
03/15 13:29:56 - mmengine - INFO - Epoch(val) [10][750/834] eta: 0:00:01 time: 0.0226 data_time: 0.0003 memory: 1715
03/15 13:29:58 - mmengine - INFO - Epoch(val) [10][800/834] eta: 0:00:00 time: 0.0228 data_time: 0.0003 memory: 1715
03/15 13:30:13 - mmengine - INFO - Evaluating bbox...
Loading and preparing results...
DONE (t=3.36s)
creating index...
index created!
Running per image evaluation...
Evaluate annotation type bbox
DONE (t=56.20s).
Accumulating evaluation results...
DONE (t=13.70s).
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.014
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.060
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.001
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.001
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.003
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.033
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.032
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.049
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.057
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.006
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.028
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.134

@wondervictor These are the logs after 10 epochs.

wondervictor commented 5 months ago

@grainw Let me rerun the training on my side and check.

grainw commented 5 months ago

During training I see memory: 11417, and also "The model and loaded state dict do not match exactly / unexpected key in source state_dict: neck.text_enhancer.projections.0.conv.weight, ...". Are these normal?

wondervictor commented 5 months ago

During training I see memory: 11417, and also "The model and loaded state dict do not match exactly / unexpected key in source state_dict: neck.text_enhancer.projections.0.conv.weight, ...". Are these normal?

That is normal: when training with the efficient neck, some of the parameters in the pretrained checkpoint are simply not used.
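
(A minimal sketch, in plain PyTorch rather than mmengine, of why an "unexpected key" report is harmless when the checkpoint is loaded non-strictly: extra entries in the checkpoint are listed and skipped, and nothing in the model is corrupted. The modules below are toy placeholders, not the real YOLO-World neck.)

```python
import torch.nn as nn

# Toy stand-in: a "checkpoint" that contains extra neck.text_enhancer weights
# which the target model (built with the efficient neck) does not have.
src = nn.ModuleDict({'neck': nn.ModuleDict({'text_enhancer': nn.Linear(4, 4),
                                            'reduce': nn.Linear(4, 4)})})
dst = nn.ModuleDict({'neck': nn.ModuleDict({'reduce': nn.Linear(4, 4)})})

# strict=False: keys present in the checkpoint but absent from the model are
# reported as "unexpected" and skipped; the remaining weights load normally.
result = dst.load_state_dict(src.state_dict(), strict=False)
print(result.unexpected_keys)  # e.g. ['neck.text_enhancer.weight', 'neck.text_enhancer.bias']
print(result.missing_keys)     # []
```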

wondervictor commented 5 months ago

@grainw Could you try the yolo_world_l_clip_base_dual_vlpan_2e-3adamw_32xb16_100e_o365_goldg_train_pretrained-0e566235.pth weights instead? I have now re-run the finetuning for 5 epochs, and so far the results look normal:

2024/03/15 16:17:51 - mmengine - INFO - Epoch(train)  [1][ 50/925]  lr: 3.5315e-06  eta: 15:34:07  time: 0.7579  data_time: 0.0800  memory: 26968  grad_norm: nan  loss: 525.4751  loss_cls: 208.8300  loss_bbox: 152.9994  loss_dfl: 163.6457
2024/03/15 16:18:14 - mmengine - INFO - Epoch(train)  [1][100/925]  lr: 7.1351e-06  eta: 12:31:37  time: 0.4626  data_time: 0.0051  memory: 11309  grad_norm: 756.9114  loss: 498.9004  loss_cls: 193.2439  loss_bbox: 145.6830  loss_dfl: 159.9734
2024/03/15 16:18:37 - mmengine - INFO - Epoch(train)  [1][150/925]  lr: 1.0739e-05  eta: 11:32:39  time: 0.4678  data_time: 0.0048  memory: 11589  grad_norm: 657.0900  loss: 461.6638  loss_cls: 173.7163  loss_bbox: 134.9502  loss_dfl: 152.9974
2024/03/15 16:19:01 - mmengine - INFO - Epoch(train)  [1][200/925]  lr: 1.4342e-05  eta: 11:04:05  time: 0.4714  data_time: 0.0045  memory: 11682  grad_norm: 640.3960  loss: 451.6395  loss_cls: 169.3258  loss_bbox: 132.0000  loss_dfl: 150.3137
2024/03/15 16:19:24 - mmengine - INFO - Epoch(train)  [1][250/925]  lr: 1.7946e-05  eta: 10:44:58  time: 0.4640  data_time: 0.0045  memory: 11362  grad_norm: 593.4599  loss: 432.6697  loss_cls: 158.8774  loss_bbox: 126.4006  loss_dfl: 147.3917
2024/03/15 16:19:48 - mmengine - INFO - Epoch(train)  [1][300/925]  lr: 2.1550e-05  eta: 10:33:42  time: 0.4718  data_time: 0.0043  memory: 11402  grad_norm: 638.8074  loss: 428.6906  loss_cls: 156.9521  loss_bbox: 126.3464  loss_dfl: 145.3921
2024/03/15 16:20:11 - mmengine - INFO - Epoch(train)  [1][350/925]  lr: 2.5153e-05  eta: 10:25:35  time: 0.4720  data_time: 0.0046  memory: 11802  grad_norm: 629.0559  loss: 422.8995  loss_cls: 155.7669  loss_bbox: 123.3008  loss_dfl: 143.8317
2024/03/15 16:20:34 - mmengine - INFO - Epoch(train)  [1][400/925]  lr: 2.8757e-05  eta: 10:18:06  time: 0.4637  data_time: 0.0046  memory: 11483  grad_norm: 642.9750  loss: 426.3928  loss_cls: 156.6082  loss_bbox: 124.9326  loss_dfl: 144.8519
2024/03/15 16:20:58 - mmengine - INFO - Epoch(train)  [1][450/925]  lr: 3.2360e-05  eta: 10:13:03  time: 0.4698  data_time: 0.0046  memory: 11390  grad_norm: 629.0176  loss: 422.2897  loss_cls: 153.1396  loss_bbox: 125.3907  loss_dfl: 143.7594
2024/03/15 16:21:21 - mmengine - INFO - Epoch(train)  [1][500/925]  lr: 3.5964e-05  eta: 10:09:18  time: 0.4729  data_time: 0.0043  memory: 11643  grad_norm: 642.1543  loss: 426.3616  loss_cls: 155.1986  loss_bbox: 126.2509  loss_dfl: 144.9121
2024/03/15 16:21:45 - mmengine - INFO - Epoch(train)  [1][550/925]  lr: 3.9568e-05  eta: 10:06:10  time: 0.4730  data_time: 0.0048  memory: 11830  grad_norm: 655.0704  loss: 418.5858  loss_cls: 151.4751  loss_bbox: 124.1011  loss_dfl: 143.0096
2024/03/15 16:22:08 - mmengine - INFO - Epoch(train)  [1][600/925]  lr: 4.3171e-05  eta: 10:02:32  time: 0.4636  data_time: 0.0048  memory: 11804  grad_norm: 644.9371  loss: 413.4443  loss_cls: 150.7068  loss_bbox: 120.4227  loss_dfl: 142.3147
2024/03/15 16:22:32 - mmengine - INFO - Epoch(train)  [1][650/925]  lr: 4.6775e-05  eta: 10:00:05  time: 0.4708  data_time: 0.0047  memory: 11498  grad_norm: 658.4046  loss: 425.0935  loss_cls: 155.0833  loss_bbox: 125.7334  loss_dfl: 144.2768
2024/03/15 16:22:56 - mmengine - INFO - Epoch(train)  [1][700/925]  lr: 5.0378e-05  eta: 9:58:07  time: 0.4731  data_time: 0.0043  memory: 11218  grad_norm: 644.3471  loss: 419.8270  loss_cls: 152.7837  loss_bbox: 123.3088  loss_dfl: 143.7345
2024/03/15 16:23:19 - mmengine - INFO - Epoch(train)  [1][750/925]  lr: 5.3982e-05  eta: 9:55:46  time: 0.4658  data_time: 0.0047  memory: 11698  grad_norm: 650.2902  loss: 424.6319  loss_cls: 153.7233  loss_bbox: 127.0469  loss_dfl: 143.8616
2024/03/15 16:23:43 - mmengine - INFO - Epoch(train)  [1][800/925]  lr: 5.7586e-05  eta: 9:54:13  time: 0.4730  data_time: 0.0047  memory: 11498  grad_norm: 676.6561  loss: 412.0664  loss_cls: 147.0775  loss_bbox: 122.7685  loss_dfl: 142.2203
2024/03/15 16:24:06 - mmengine - INFO - Epoch(train)  [1][850/925]  lr: 6.1189e-05  eta: 9:52:51  time: 0.4736  data_time: 0.0047  memory: 11324  grad_norm: 725.5823  loss: 416.5763  loss_cls: 151.3038  loss_bbox: 122.5043  loss_dfl: 142.7682
2024/03/15 16:24:30 - mmengine - INFO - Epoch(train)  [1][900/925]  lr: 6.4793e-05  eta: 9:51:05  time: 0.4660  data_time: 0.0046  memory: 11178  grad_norm: 691.6883  loss: 414.5879  loss_cls: 149.5996  loss_bbox: 123.2322  loss_dfl: 141.7560
2024/03/15 16:24:47 - mmengine - INFO - Exp name: yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco_20240315_161620
2024/03/15 16:25:15 - mmengine - INFO - Epoch(train)  [2][ 50/925]  lr: 6.9329e-05  eta: 10:01:00  time: 0.5402  data_time: 0.0571  memory: 19445  grad_norm: inf  loss: 407.0161  loss_cls: 145.5086  loss_bbox: 120.9010  loss_dfl: 140.6065
2024/03/15 16:25:27 - mmengine - INFO - Exp name: yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco_20240315_161620
2024/03/15 16:25:38 - mmengine - INFO - Epoch(train)  [2][100/925]  lr: 7.2889e-05  eta: 9:59:26  time: 0.4742  data_time: 0.0047  memory: 11684  grad_norm: 689.7203  loss: 415.4700  loss_cls: 150.6841  loss_bbox: 122.6313  loss_dfl: 142.1546
2024/03/15 16:26:02 - mmengine - INFO - Epoch(train)  [2][150/925]  lr: 7.6448e-05  eta: 9:57:22  time: 0.4636  data_time: 0.0046  memory: 11524  grad_norm: 665.1685  loss: 415.2065  loss_cls: 150.1455  loss_bbox: 122.5761  loss_dfl: 142.4849
2024/03/15 16:26:25 - mmengine - INFO - Epoch(train)  [2][200/925]  lr: 8.0007e-05  eta: 9:55:49  time: 0.4704  data_time: 0.0046  memory: 11524  grad_norm: 728.3835  loss: 415.3687  loss_cls: 149.7405  loss_bbox: 123.5458  loss_dfl: 142.0824
2024/03/15 16:26:49 - mmengine - INFO - Epoch(train)  [2][250/925]  lr: 8.3566e-05  eta: 9:54:25  time: 0.4715  data_time: 0.0046  memory: 11151  grad_norm: 700.3310  loss: 410.9091  loss_cls: 146.8674  loss_bbox: 121.8705  loss_dfl: 142.1712
2024/03/15 16:27:12 - mmengine - INFO - Epoch(train)  [2][300/925]  lr: 8.7125e-05  eta: 9:53:17  time: 0.4748  data_time: 0.0045  memory: 11484  grad_norm: 726.3016  loss: 421.8650  loss_cls: 153.4554  loss_bbox: 125.3028  loss_dfl: 143.1068
2024/03/15 16:27:36 - mmengine - INFO - Epoch(train)  [2][350/925]  lr: 9.0684e-05  eta: 9:51:37  time: 0.4630  data_time: 0.0045  memory: 11298  grad_norm: 700.8366  loss: 412.1020  loss_cls: 149.4360  loss_bbox: 121.0278  loss_dfl: 141.6382
2024/03/15 16:27:59 - mmengine - INFO - Epoch(train)  [2][400/925]  lr: 9.4243e-05  eta: 9:50:31  time: 0.4727  data_time: 0.0045  memory: 12298  grad_norm: 766.4367  loss: 409.5823  loss_cls: 144.5384  loss_bbox: 121.7848  loss_dfl: 143.2591
2024/03/15 16:28:23 - mmengine - INFO - Epoch(train)  [2][450/925]  lr: 9.7802e-05  eta: 9:49:36  time: 0.4762  data_time: 0.0047  memory: 11804  grad_norm: 737.8565  loss: 417.8665  loss_cls: 152.1701  loss_bbox: 122.2711  loss_dfl: 143.4254
2024/03/15 16:28:46 - mmengine - INFO - Epoch(train)  [2][500/925]  lr: 1.0136e-04  eta: 9:48:14  time: 0.4643  data_time: 0.0046  memory: 11524  grad_norm: 713.1801  loss: 418.6109  loss_cls: 150.9499  loss_bbox: 123.9508  loss_dfl: 143.7103
2024/03/15 16:29:10 - mmengine - INFO - Epoch(train)  [2][550/925]  lr: 1.0492e-04  eta: 9:47:23  time: 0.4754  data_time: 0.0047  memory: 11377  grad_norm: 771.2182  loss: 409.3603  loss_cls: 147.6367  loss_bbox: 121.4380  loss_dfl: 140.2856
2024/03/15 16:29:34 - mmengine - INFO - Epoch(train)  [2][600/925]  lr: 1.0848e-04  eta: 9:46:27  time: 0.4725  data_time: 0.0047  memory: 11204  grad_norm: 719.4910  loss: 410.1299  loss_cls: 146.3208  loss_bbox: 122.7517  loss_dfl: 141.0574
2024/03/15 16:29:57 - mmengine - INFO - Epoch(train)  [2][650/925]  lr: 1.1204e-04  eta: 9:45:16  time: 0.4654  data_time: 0.0045  memory: 11271  grad_norm: 757.1855  loss: 417.1097  loss_cls: 151.4211  loss_bbox: 122.8683  loss_dfl: 142.8203
2024/03/15 16:30:21 - mmengine - INFO - Epoch(train)  [2][700/925]  lr: 1.1560e-04  eta: 9:44:30  time: 0.4752  data_time: 0.0047  memory: 11297  grad_norm: 738.2053  loss: 413.9922  loss_cls: 149.0836  loss_bbox: 122.0435  loss_dfl: 142.8651
2024/03/15 16:30:44 - mmengine - INFO - Epoch(train)  [2][750/925]  lr: 1.1916e-04  eta: 9:43:39  time: 0.4720  data_time: 0.0047  memory: 11711  grad_norm: 732.4855  loss: 416.8074  loss_cls: 148.4825  loss_bbox: 125.0522  loss_dfl: 143.2727
2024/03/15 16:31:08 - mmengine - INFO - Epoch(train)  [2][800/925]  lr: 1.2271e-04  eta: 9:42:39  time: 0.4672  data_time: 0.0048  memory: 11537  grad_norm: 776.4783  loss: 420.2762  loss_cls: 151.4378  loss_bbox: 125.7740  loss_dfl: 143.0644
2024/03/15 16:31:31 - mmengine - INFO - Epoch(train)  [2][850/925]  lr: 1.2627e-04  eta: 9:41:52  time: 0.4722  data_time: 0.0042  memory: 11124  grad_norm: 703.0178  loss: 416.7561  loss_cls: 151.3379  loss_bbox: 122.4973  loss_dfl: 142.9209
2024/03/15 16:31:55 - mmengine - INFO - Epoch(train)  [2][900/925]  lr: 1.2983e-04  eta: 9:41:05  time: 0.4720  data_time: 0.0047  memory: 11497  grad_norm: 735.8698  loss: 413.9604  loss_cls: 147.8014  loss_bbox: 123.3400  loss_dfl: 142.8190
2024/03/15 16:32:06 - mmengine - INFO - Exp name: yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco_20240315_161620
2024/03/15 16:32:33 - mmengine - INFO - Epoch(train)  [3][ 50/925]  lr: 1.3348e-04  eta: 9:41:32  time: 0.5331  data_time: 0.0558  memory: 11591  grad_norm: 791.2741  loss: 411.4333  loss_cls: 148.5960  loss_bbox: 121.1281  loss_dfl: 141.7092
2024/03/15 16:32:57 - mmengine - INFO - Epoch(train)  [3][100/925]  lr: 1.3699e-04  eta: 9:40:46  time: 0.4724  data_time: 0.0047  memory: 11537  grad_norm: 756.8158  loss: 414.4460  loss_cls: 149.2828  loss_bbox: 122.2934  loss_dfl: 142.8698
2024/03/15 16:33:21 - mmengine - INFO - Exp name: yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco_20240315_161620
2024/03/15 16:33:21 - mmengine - INFO - Epoch(train)  [3][150/925]  lr: 1.4051e-04  eta: 9:40:08  time: 0.4756  data_time: 0.0047  memory: 11457  grad_norm: 741.1044  loss: 413.0435  loss_cls: 146.6127  loss_bbox: 123.5653  loss_dfl: 142.8655
2024/03/15 16:33:44 - mmengine - INFO - Epoch(train)  [3][200/925]  lr: 1.4402e-04  eta: 9:39:10  time: 0.4642  data_time: 0.0045  memory: 11391  grad_norm: 791.1428  loss: 417.2338  loss_cls: 150.0461  loss_bbox: 123.5253  loss_dfl: 143.6624
2024/03/15 16:34:07 - mmengine - INFO - Epoch(train)  [3][250/925]  lr: 1.4754e-04  eta: 9:38:27  time: 0.4723  data_time: 0.0048  memory: 11297  grad_norm: 751.2373  loss: 414.7239  loss_cls: 148.5880  loss_bbox: 123.3747  loss_dfl: 142.7612
2024/03/15 16:34:31 - mmengine - INFO - Epoch(train)  [3][300/925]  lr: 1.5105e-04  eta: 9:37:46  time: 0.4727  data_time: 0.0048  memory: 11364  grad_norm: 752.3860  loss: 402.4461  loss_cls: 143.6677  loss_bbox: 118.4292  loss_dfl: 140.3493
2024/03/15 16:34:55 - mmengine - INFO - Epoch(train)  [3][350/925]  lr: 1.5456e-04  eta: 9:37:06  time: 0.4726  data_time: 0.0049  memory: 11724  grad_norm: 771.6746  loss: 420.1259  loss_cls: 151.5801  loss_bbox: 125.4478  loss_dfl: 143.0981
2024/03/15 16:35:18 - mmengine - INFO - Epoch(train)  [3][400/925]  lr: 1.5808e-04  eta: 9:36:17  time: 0.4666  data_time: 0.0044  memory: 11471  grad_norm: 793.5751  loss: 417.8385  loss_cls: 150.9237  loss_bbox: 123.2844  loss_dfl: 143.6304
2024/03/15 16:35:43 - mmengine - INFO - Epoch(train)  [3][450/925]  lr: 1.6159e-04  eta: 9:36:02  time: 0.4880  data_time: 0.0047  memory: 11457  grad_norm: 764.7687  loss: 420.2951  loss_cls: 151.3505  loss_bbox: 125.7738  loss_dfl: 143.1708
2024/03/15 16:36:06 - mmengine - INFO - Epoch(train)  [3][500/925]  lr: 1.6511e-04  eta: 9:35:32  time: 0.4781  data_time: 0.0049  memory: 11697  grad_norm: 776.4341  loss: 421.6394  loss_cls: 152.0562  loss_bbox: 125.8199  loss_dfl: 143.7634
2024/03/15 16:36:30 - mmengine - INFO - Epoch(train)  [3][550/925]  lr: 1.6862e-04  eta: 9:34:43  time: 0.4653  data_time: 0.0046  memory: 11284  grad_norm: 793.4142  loss: 423.6822  loss_cls: 152.8281  loss_bbox: 126.4344  loss_dfl: 144.4198
2024/03/15 16:36:54 - mmengine - INFO - Epoch(train)  [3][600/925]  lr: 1.7214e-04  eta: 9:34:14  time: 0.4780  data_time: 0.0047  memory: 11177  grad_norm: inf  loss: 414.4807  loss_cls: 149.9818  loss_bbox: 122.4875  loss_dfl: 142.0114
2024/03/15 16:37:18 - mmengine - INFO - Epoch(train)  [3][650/925]  lr: 1.7565e-04  eta: 9:33:46  time: 0.4789  data_time: 0.0045  memory: 11484  grad_norm: 752.1249  loss: 418.3147  loss_cls: 148.4957  loss_bbox: 125.9196  loss_dfl: 143.8994
2024/03/15 16:37:42 - mmengine - INFO - Epoch(train)  [3][700/925]  lr: 1.7916e-04  eta: 9:33:22  time: 0.4813  data_time: 0.0049  memory: 11537  grad_norm: 806.5516  loss: 425.4617  loss_cls: 152.4550  loss_bbox: 128.0599  loss_dfl: 144.9468
2024/03/15 16:38:05 - mmengine - INFO - Epoch(train)  [3][750/925]  lr: 1.8268e-04  eta: 9:32:43  time: 0.4706  data_time: 0.0047  memory: 11591  grad_norm: 788.5291  loss: 418.3670  loss_cls: 151.0146  loss_bbox: 123.5574  loss_dfl: 143.7950
2024/03/15 16:38:29 - mmengine - INFO - Epoch(train)  [3][800/925]  lr: 1.8619e-04  eta: 9:32:16  time: 0.4791  data_time: 0.0048  memory: 11244  grad_norm: 781.7332  loss: 421.9915  loss_cls: 152.4264  loss_bbox: 125.2268  loss_dfl: 144.3383
2024/03/15 16:38:53 - mmengine - INFO - Epoch(train)  [3][850/925]  lr: 1.8971e-04  eta: 9:31:52  time: 0.4819  data_time: 0.0046  memory: 11177  grad_norm: 781.4488  loss: 418.2716  loss_cls: 150.7085  loss_bbox: 123.8938  loss_dfl: 143.6692
2024/03/15 16:39:17 - mmengine - INFO - Epoch(train)  [3][900/925]  lr: 1.9322e-04  eta: 9:31:09  time: 0.4660  data_time: 0.0046  memory: 11591  grad_norm: 743.8437  loss: 420.1414  loss_cls: 152.0295  loss_bbox: 124.6641  loss_dfl: 143.4478
2024/03/15 16:39:28 - mmengine - INFO - Exp name: yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco_20240315_161620
2024/03/15 16:39:56 - mmengine - INFO - Epoch(train)  [4][ 50/925]  lr: 1.9258e-04  eta: 9:31:38  time: 0.5408  data_time: 0.0514  memory: 11377  grad_norm: 846.3355  loss: 413.7974  loss_cls: 148.8526  loss_bbox: 123.2204  loss_dfl: 141.7244
2024/03/15 16:40:19 - mmengine - INFO - Epoch(train)  [4][100/925]  lr: 1.9258e-04  eta: 9:30:59  time: 0.4700  data_time: 0.0018  memory: 11311  grad_norm: 840.2444  loss: 424.9210  loss_cls: 154.1513  loss_bbox: 126.0876  loss_dfl: 144.6821
2024/03/15 16:40:43 - mmengine - INFO - Epoch(train)  [4][150/925]  lr: 1.9258e-04  eta: 9:30:21  time: 0.4701  data_time: 0.0018  memory: 11617  grad_norm: 757.9608  loss: 427.7514  loss_cls: 155.5139  loss_bbox: 127.5539  loss_dfl: 144.6835
2024/03/15 16:41:07 - mmengine - INFO - Epoch(train)  [4][200/925]  lr: 1.9258e-04  eta: 9:29:56  time: 0.4808  data_time: 0.0018  memory: 11497  grad_norm: 738.7337  loss: 427.7940  loss_cls: 154.4997  loss_bbox: 127.3853  loss_dfl: 145.9090
2024/03/15 16:41:19 - mmengine - INFO - Exp name: yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco_20240315_161620
2024/03/15 16:41:31 - mmengine - INFO - Epoch(train)  [4][250/925]  lr: 1.9258e-04  eta: 9:29:31  time: 0.4809  data_time: 0.0018  memory: 11351  grad_norm: 724.3692  loss: 423.4680  loss_cls: 153.6985  loss_bbox: 125.8745  loss_dfl: 143.8950
2024/03/15 16:41:54 - mmengine - INFO - Epoch(train)  [4][300/925]  lr: 1.9258e-04  eta: 9:28:52  time: 0.4683  data_time: 0.0017  memory: 11364  grad_norm: 802.7019  loss: 426.9299  loss_cls: 154.8474  loss_bbox: 126.6810  loss_dfl: 145.4015
2024/03/15 16:42:18 - mmengine - INFO - Epoch(train)  [4][350/925]  lr: 1.9258e-04  eta: 9:28:23  time: 0.4770  data_time: 0.0018  memory: 11471  grad_norm: 773.6626  loss: 418.8825  loss_cls: 151.5354  loss_bbox: 123.5200  loss_dfl: 143.8271
2024/03/15 16:42:42 - mmengine - INFO - Epoch(train)  [4][400/925]  lr: 1.9258e-04  eta: 9:27:52  time: 0.4748  data_time: 0.0018  memory: 11444  grad_norm: 797.0045  loss: 422.8512  loss_cls: 151.5049  loss_bbox: 126.9762  loss_dfl: 144.3701
2024/03/15 16:43:05 - mmengine - INFO - Epoch(train)  [4][450/925]  lr: 1.9258e-04  eta: 9:27:08  time: 0.4629  data_time: 0.0019  memory: 11431  grad_norm: 765.5104  loss: 417.1834  loss_cls: 151.3039  loss_bbox: 123.5429  loss_dfl: 142.3367
2024/03/15 16:43:29 - mmengine - INFO - Epoch(train)  [4][500/925]  lr: 1.9258e-04  eta: 9:26:42  time: 0.4789  data_time: 0.0018  memory: 11164  grad_norm: 748.9262  loss: 425.9064  loss_cls: 155.7105  loss_bbox: 125.1608  loss_dfl: 145.0351
2024/03/15 16:43:53 - mmengine - INFO - Epoch(train)  [4][550/925]  lr: 1.9258e-04  eta: 9:26:13  time: 0.4764  data_time: 0.0017  memory: 11471  grad_norm: 696.1131  loss: 424.1156  loss_cls: 154.1871  loss_bbox: 125.6517  loss_dfl: 144.2768
2024/03/15 16:44:16 - mmengine - INFO - Epoch(train)  [4][600/925]  lr: 1.9258e-04  eta: 9:25:36  time: 0.4681  data_time: 0.0019  memory: 11071  grad_norm: 732.9221  loss: 420.6391  loss_cls: 151.0640  loss_bbox: 125.7588  loss_dfl: 143.8164
2024/03/15 16:44:40 - mmengine - INFO - Epoch(train)  [4][650/925]  lr: 1.9258e-04  eta: 9:25:09  time: 0.4776  data_time: 0.0019  memory: 11604  grad_norm: 756.3554  loss: 418.5091  loss_cls: 150.6214  loss_bbox: 125.3326  loss_dfl: 142.5551
2024/03/15 16:45:04 - mmengine - INFO - Epoch(train)  [4][700/925]  lr: 1.9258e-04  eta: 9:24:42  time: 0.4777  data_time: 0.0019  memory: 11244  grad_norm: 761.7258  loss: 419.0587  loss_cls: 151.2744  loss_bbox: 123.7322  loss_dfl: 144.0520
2024/03/15 16:45:28 - mmengine - INFO - Epoch(train)  [4][750/925]  lr: 1.9258e-04  eta: 9:24:14  time: 0.4766  data_time: 0.0019  memory: 11364  grad_norm: 745.9792  loss: 420.9096  loss_cls: 150.0314  loss_bbox: 126.7636  loss_dfl: 144.1146
2024/03/15 16:45:51 - mmengine - INFO - Epoch(train)  [4][800/925]  lr: 1.9258e-04  eta: 9:23:34  time: 0.4634  data_time: 0.0018  memory: 11311  grad_norm: 736.6421  loss: 424.5623  loss_cls: 153.1537  loss_bbox: 127.2009  loss_dfl: 144.2077
2024/03/15 16:46:16 - mmengine - INFO - Epoch(train)  [4][850/925]  lr: 1.9258e-04  eta: 9:23:21  time: 0.4915  data_time: 0.0018  memory: 11297  grad_norm: 776.4159  loss: 417.8380  loss_cls: 149.0427  loss_bbox: 125.2182  loss_dfl: 143.5771
2024/03/15 16:46:40 - mmengine - INFO - Epoch(train)  [4][900/925]  lr: 1.9258e-04  eta: 9:22:58  time: 0.4823  data_time: 0.0018  memory: 11377  grad_norm: 787.5051  loss: 416.1993  loss_cls: 148.7611  loss_bbox: 123.3235  loss_dfl: 144.1147
2024/03/15 16:46:51 - mmengine - INFO - Exp name: yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco_20240315_161620
2024/03/15 16:47:18 - mmengine - INFO - Epoch(train)  [5][ 50/925]  lr: 1.9258e-04  eta: 9:23:08  time: 0.5472  data_time: 0.0472  memory: 11377  grad_norm: 711.5931  loss: 424.6482  loss_cls: 153.5620  loss_bbox: 127.1742  loss_dfl: 143.9120
2024/03/15 16:47:43 - mmengine - INFO - Epoch(train)  [5][100/925]  lr: 1.9258e-04  eta: 9:22:50  time: 0.4883  data_time: 0.0018  memory: 11271  grad_norm: 741.8803  loss: 418.2314  loss_cls: 151.1732  loss_bbox: 123.3328  loss_dfl: 143.7254
2024/03/15 16:48:07 - mmengine - INFO - Epoch(train)  [5][150/925]  lr: 1.9258e-04  eta: 9:22:33  time: 0.4879  data_time: 0.0018  memory: 11324  grad_norm: 725.5641  loss: 416.2239  loss_cls: 149.4518  loss_bbox: 123.3476  loss_dfl: 143.4245
2024/03/15 16:48:31 - mmengine - INFO - Epoch(train)  [5][200/925]  lr: 1.9258e-04  eta: 9:21:56  time: 0.4678  data_time: 0.0018  memory: 11337  grad_norm: 757.6571  loss: 427.3945  loss_cls: 154.3230  loss_bbox: 126.9904  loss_dfl: 146.0810
2024/03/15 16:48:55 - mmengine - INFO - Epoch(train)  [5][250/925]  lr: 1.9258e-04  eta: 9:21:39  time: 0.4880  data_time: 0.0018  memory: 11257  grad_norm: 718.5845  loss: 418.5219  loss_cls: 149.8339  loss_bbox: 125.5260  loss_dfl: 143.1619
2024/03/15 16:49:20 - mmengine - INFO - Exp name: yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco_20240315_161620
2024/03/15 16:49:20 - mmengine - INFO - Epoch(train)  [5][300/925]  lr: 1.9258e-04  eta: 9:21:24  time: 0.4915  data_time: 0.0018  memory: 11391  grad_norm: 746.0813  loss: 417.2469  loss_cls: 149.4160  loss_bbox: 123.9316  loss_dfl: 143.8993
2024/03/15 16:49:43 - mmengine - INFO - Epoch(train)  [5][350/925]  lr: 1.9258e-04  eta: 9:20:54  time: 0.4751  data_time: 0.0019  memory: 11231  grad_norm: 786.6009  loss: 416.9541  loss_cls: 149.2718  loss_bbox: 124.7095  loss_dfl: 142.9729
2024/03/15 16:50:08 - mmengine - INFO - Epoch(train)  [5][400/925]  lr: 1.9258e-04  eta: 9:20:39  time: 0.4910  data_time: 0.0018  memory: 11377  grad_norm: 778.6423  loss: 424.1211  loss_cls: 153.0846  loss_bbox: 126.5452  loss_dfl: 144.4913
2024/03/15 16:50:32 - mmengine - INFO - Epoch(train)  [5][450/925]  lr: 1.9258e-04  eta: 9:20:21  time: 0.4885  data_time: 0.0018  memory: 11484  grad_norm: 701.9506  loss: 417.6811  loss_cls: 150.0850  loss_bbox: 124.3821  loss_dfl: 143.2140
2024/03/15 16:50:57 - mmengine - INFO - Epoch(train)  [5][500/925]  lr: 1.9258e-04  eta: 9:19:57  time: 0.4821  data_time: 0.0020  memory: 11844  grad_norm: 744.3425  loss: 422.0160  loss_cls: 150.9617  loss_bbox: 125.3786  loss_dfl: 145.6757
2024/03/15 16:51:20 - mmengine - INFO - Epoch(train)  [5][550/925]  lr: 1.9258e-04  eta: 9:19:30  time: 0.4776  data_time: 0.0018  memory: 11337  grad_norm: 749.4043  loss: 425.6322  loss_cls: 151.7524  loss_bbox: 127.7883  loss_dfl: 146.0916
2024/03/15 16:51:45 - mmengine - INFO - Epoch(train)  [5][600/925]  lr: 1.9258e-04  eta: 9:19:15  time: 0.4924  data_time: 0.0019  memory: 11284  grad_norm: 755.8023  loss: 414.8110  loss_cls: 147.3276  loss_bbox: 123.4438  loss_dfl: 144.0396
2024/03/15 16:52:10 - mmengine - INFO - Epoch(train)  [5][650/925]  lr: 1.9258e-04  eta: 9:18:58  time: 0.4902  data_time: 0.0018  memory: 11484  grad_norm: 813.1651  loss: 418.8702  loss_cls: 148.6041  loss_bbox: 126.1573  loss_dfl: 144.1087
2024/03/15 16:52:33 - mmengine - INFO - Epoch(train)  [5][700/925]  lr: 1.9258e-04  eta: 9:18:28  time: 0.4735  data_time: 0.0018  memory: 11444  grad_norm: 755.9740  loss: 422.6757  loss_cls: 151.6788  loss_bbox: 125.4517  loss_dfl: 145.5452
2024/03/15 16:52:58 - mmengine - INFO - Epoch(train)  [5][750/925]  lr: 1.9258e-04  eta: 9:18:07  time: 0.4861  data_time: 0.0018  memory: 11751  grad_norm: 685.1831  loss: 417.5276  loss_cls: 148.6724  loss_bbox: 125.7465  loss_dfl: 143.1088
2024/03/15 16:53:22 - mmengine - INFO - Epoch(train)  [5][800/925]  lr: 1.9258e-04  eta: 9:17:50  time: 0.4902  data_time: 0.0018  memory: 11177  grad_norm: 824.9883  loss: 418.6440  loss_cls: 148.8286  loss_bbox: 125.4005  loss_dfl: 144.4149
2024/03/15 16:53:46 - mmengine - INFO - Epoch(train)  [5][850/925]  lr: 1.9258e-04  eta: 9:17:16  time: 0.4686  data_time: 0.0017  memory: 11471  grad_norm: 742.1786  loss: 420.7426  loss_cls: 151.2453  loss_bbox: 125.8574  loss_dfl: 143.6399
2024/03/15 16:54:10 - mmengine - INFO - Epoch(train)  [5][900/925]  lr: 1.9258e-04  eta: 9:16:57  time: 0.4889  data_time: 0.0018  memory: 11511  grad_norm: 708.2669  loss: 424.4981  loss_cls: 151.7956  loss_bbox: 127.1316  loss_dfl: 145.5709
2024/03/15 16:54:22 - mmengine - INFO - Exp name: yolo_world_l_efficient_neck_2e-4_80e_8gpus_mask-refine_finetune_coco_20240315_161620
2024/03/15 16:54:22 - mmengine - INFO - Saving checkpoint at 5 epochs
2024/03/15 16:54:25 - mmengine - WARNING - `save_param_scheduler` is True but `self.param_schedulers` is None, so skip saving parameter schedulers
2024/03/15 16:54:33 - mmengine - INFO - Epoch(val)  [5][ 50/625]    eta: 0:00:48  time: 0.0836  data_time: 0.0042  memory: 14329  
2024/03/15 16:54:35 - mmengine - INFO - Epoch(val)  [5][100/625]    eta: 0:00:36  time: 0.0541  data_time: 0.0003  memory: 1696  
2024/03/15 16:54:38 - mmengine - INFO - Epoch(val)  [5][150/625]    eta: 0:00:30  time: 0.0522  data_time: 0.0002  memory: 1696  
2024/03/15 16:54:41 - mmengine - INFO - Epoch(val)  [5][200/625]    eta: 0:00:25  time: 0.0496  data_time: 0.0003  memory: 1696  
2024/03/15 16:54:43 - mmengine - INFO - Epoch(val)  [5][250/625]    eta: 0:00:21  time: 0.0532  data_time: 0.0003  memory: 1696  
2024/03/15 16:54:46 - mmengine - INFO - Epoch(val)  [5][300/625]    eta: 0:00:18  time: 0.0566  data_time: 0.0003  memory: 1696  
2024/03/15 16:54:49 - mmengine - INFO - Epoch(val)  [5][350/625]    eta: 0:00:15  time: 0.0503  data_time: 0.0002  memory: 1696  
2024/03/15 16:54:51 - mmengine - INFO - Epoch(val)  [5][400/625]    eta: 0:00:12  time: 0.0552  data_time: 0.0002  memory: 1696  
2024/03/15 16:54:54 - mmengine - INFO - Epoch(val)  [5][450/625]    eta: 0:00:09  time: 0.0533  data_time: 0.0003  memory: 1696  
2024/03/15 16:54:56 - mmengine - INFO - Epoch(val)  [5][500/625]    eta: 0:00:06  time: 0.0503  data_time: 0.0002  memory: 1696  
2024/03/15 16:54:59 - mmengine - INFO - Epoch(val)  [5][550/625]    eta: 0:00:04  time: 0.0552  data_time: 0.0003  memory: 1696  
2024/03/15 16:55:02 - mmengine - INFO - Epoch(val)  [5][600/625]    eta: 0:00:01  time: 0.0560  data_time: 0.0003  memory: 1696  
2024/03/15 16:55:28 - mmengine - INFO - Evaluating bbox...
2024/03/15 16:57:24 - mmengine - INFO - bbox_mAP_copypaste: 0.411 0.563 0.449 0.252 0.461 0.534
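
(For readers comparing the two runs: bbox_mAP_copypaste prints the six COCO box metrics in mmdet's fixed order mAP, mAP_50, mAP_75, mAP_s, mAP_m, mAP_l. A tiny helper to label them:)

```python
# Label the six values of an mmdet bbox_mAP_copypaste string.
FIELDS = ['mAP', 'mAP_50', 'mAP_75', 'mAP_s', 'mAP_m', 'mAP_l']

def label(copypaste: str) -> dict:
    return dict(zip(FIELDS, map(float, copypaste.split())))

print(label('0.017 0.068 0.001 0.001 0.003 0.041'))  # the collapsed run from the issue title
print(label('0.411 0.563 0.449 0.252 0.461 0.534'))  # the healthy 5-epoch rerun above
```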

KDgggg commented 5 months ago

Hi, I'd like to ask: before finetuning, testing on my own dataset gave an AP of roughly 0.26, but after finetuning it actually dropped to only 0.08. I'm using the v2x pretrained weights.

wondervictor commented 5 months ago

@KDgggg Could you describe your data and config? Also, please check whether the annotations are correct and whether the Text JSON matches your dataset's categories.
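
(A minimal sketch of such a consistency check, assuming the common YOLO-World text-file layout in which the JSON is a list and entry i holds the caption(s) for class id i; the file paths are placeholders:)

```python
import json

TEXT_JSON = 'data/texts/my_class_texts.json'        # placeholder, e.g. [["car"], ["pedestrian"], ...]
ANN_JSON = 'data/annotations/instances_train.json'  # placeholder COCO-format annotation file

with open(TEXT_JSON) as f:
    texts = json.load(f)
with open(ANN_JSON) as f:
    ann = json.load(f)

# COCO categories sorted by id; the assumption is that label index i during
# training corresponds to texts[i], so the two orderings must agree.
cats = sorted(ann['categories'], key=lambda c: c['id'])
for idx, cat in enumerate(cats):
    names = texts[idx] if idx < len(texts) else None
    ok = names is not None and cat['name'] in names
    print(f"{idx:3d}  ann: {cat['name']:<20s}  text: {names}  {'OK' if ok else 'MISMATCH'}")
```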