zhenhuat / STCFormer

(CVPR 2023) 3D Human Pose Estimation with Spatio-Temporal Criss-cross Attention

Parameter settings for the 243-frame experiment #3

Open zwr-04 opened 1 year ago

zwr-04 commented 1 year ago

Hello, if I want to reproduce the paper's results measured with 243 frames on Human3.6M (STCFormer and STCFormer-L), what is the complete set of training parameters for 243 frames? If convenient, could you provide the training script, including settings such as batch size and stride? Thanks!

kyky233 commented 1 year ago

Hello, I notice that for the 27 frames and 81 frames experiment, you set tds = 3, but here for the 243 frames experiment, you use tds = 2? Could you please explain what is "tds" and why you use different values for different experiments?

zhenhuat commented 1 year ago

> Hello, I notice that for the 27 frames and 81 frames experiment, you set tds = 3, but here for the 243 frames experiment, you use tds = 2? Could you please explain what is "tds" and why you use different values for different experiments?

Hi, tds stands for Temporal Downsampling Strategy, introduced in the paper "P-STMO: Pre-Trained Spatial Temporal Many-to-One Model for 3D Human Pose Estimation." It lets the network cover a longer time range with the same number of input frames, effectively enlarging the temporal receptive field without increasing the parameter count or computational cost. In the 243-frame experiment we used tds = 2 because of the videos' limited duration.
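The effect of tds can be sketched as follows: instead of feeding consecutive frames, the loader samples every tds-th frame, so the same number of input frames spans a tds-times-longer window of the raw video. This is a minimal illustration only, not the repository's actual dataloader code:

```python
import numpy as np

def sample_window(video_len, num_frames, tds):
    """Pick `num_frames` indices spaced `tds` apart from the start of a clip.

    The window covers (num_frames - 1) * tds + 1 raw frames, so tds > 1
    enlarges the temporal receptive field without adding input frames.
    """
    indices = np.arange(num_frames) * tds
    assert indices[-1] < video_len, "clip too short for this tds"
    return indices

# 243 input frames with tds = 2 span 485 raw frames:
idx = sample_window(video_len=500, num_frames=243, tds=2)
print(len(idx), idx[-1] - idx[0] + 1)  # 243 485
```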

zhenhuat commented 1 year ago

> Hello, if I want to reproduce the paper's results measured with 243 frames on Human3.6M (STCFormer and STCFormer-L), what is the complete set of training parameters for 243 frames? If convenient, could you provide the training script, including settings such as batch size and stride? Thanks!

Hello, since I currently don't have the compute to run 243 frames again, I can't confirm the exact parameter settings; you could try increasing the batch size or stride yourself.

STCFormer-243: `python run_stc.py -f 243 -b 128 --train 1 --layers 6 -tds 2 -s 3`

STCFormer-L-243: `python run_stc.py -f 243 -b 128 --train 1 --layers 6 -tds 2 -s 3 --d_hid 512`

Normally the result should be around 41 mm. Ensembling the 243-frame test predictions with the 81-frame test predictions (summing the coordinates and averaging) brings the error down to around 40.5 mm.
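The coordinate-averaging ensemble described above can be sketched like this; the array names and shapes are hypothetical, assuming both models' test predictions are aligned frame-by-frame:

```python
import numpy as np

# Hypothetical stand-ins for the two models' aligned test predictions,
# shaped (num_frames, num_joints, 3) in millimeters.
rng = np.random.default_rng(0)
pred_243 = rng.normal(size=(100, 17, 3))  # 243-frame model output
pred_81 = rng.normal(size=(100, 17, 3))   # 81-frame model output

# Ensemble: sum the predicted joint coordinates and average.
pred_ensemble = (pred_243 + pred_81) / 2.0
print(pred_ensemble.shape)  # (100, 17, 3)
```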

zwr-04 commented 1 year ago

OK 👌 May I ask whether your 243-frame results were run on a single 2080 Ti? When I run it, the GPU memory appears to be insufficient.

zhenhuat commented 1 year ago

> OK 👌 May I ask whether your 243-frame results were run on a single 2080 Ti? When I run it, the GPU memory appears to be insufficient.

To run the 243-frame model on a 2080 Ti you need to decrease -b and increase -s; the actual batch then becomes batch_size/s, but performance will not be optimal. My best results were obtained on a single A100.
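A minimal sketch of the batch arithmetic described above (illustrative only; it assumes the actual batch is the -b value divided by the -s value):

```python
def effective_batch(batch_size: int, s: int) -> int:
    """Actual batch processed when -b is split across stride -s."""
    return batch_size // s

# With the suggested -b 128 -s 3, lowering -b or raising -s shrinks
# the per-step memory footprint at some cost in final accuracy.
print(effective_batch(128, 3))  # 42
print(effective_batch(64, 6))   # 10
```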

zwr-04 commented 1 year ago

OK, thanks a lot! I'll try to reproduce it~

gtftyih commented 11 months ago

> OK 👌 May I ask whether your 243-frame results were run on a single 2080 Ti? When I run it, the GPU memory appears to be insufficient.
>
> To run the 243-frame model on a 2080 Ti you need to decrease -b and increase -s; the actual batch then becomes batch_size/s, but performance will not be optimal. My best results were obtained on a single A100.

Hello, roughly how many epochs did the 243-frame experiment take to reach its best result? If convenient, could you share the training parameters and training log from that run?

zhenhuat commented 11 months ago

> OK 👌 May I ask whether your 243-frame results were run on a single 2080 Ti? When I run it, the GPU memory appears to be insufficient.
>
> To run the 243-frame model on a 2080 Ti you need to decrease -b and increase -s; the actual batch then becomes batch_size/s, but performance will not be optimal. My best results were obtained on a single A100.
>
> Hello, roughly how many epochs did the 243-frame experiment take to reach its best result? If convenient, could you share the training parameters and training log from that run?

Hello, at every frame count the model converges within 20 epochs. Since I no longer have the compute, I can't provide the training parameters. Below is the training log from the A100. It is rather messy and mixes the CPN-243, GT-243, and STCFormer-Large results. Also, the CPN numbers 41.0 and 40.5 in the paper were obtained by ensembling the results in this log (roughly 41.4 and 41.0, corresponding to the 11.08–11.17 training period) with the 81-frame results. The `p1: 40.46, p2: 31.00` at the very end of the log came from later parameter tuning, but I no longer remember exactly what the settings were.

```
2022/10/23 06:21:36 epoch: 1, lr: 0.0010000, loss: 0.0276, MPJPE: 27.26, p1: 42.79, p2: 33.74
2022/10/24 02:23:16 epoch: 2, lr: 0.0009600, loss: 0.0169, MPJPE: 16.83, p1: 42.77, p2: 33.45
2022/10/24 22:24:32 epoch: 3, lr: 0.0009216, loss: 0.0151, MPJPE: 15.02, p1: 43.30, p2: 33.51
2022/10/25 18:25:18 epoch: 4, lr: 0.0008847, loss: 0.0141, MPJPE: 14.07, p1: 43.47, p2: 33.51
2022/10/26 14:25:54 epoch: 5, lr: 0.0008493, loss: 0.0135, MPJPE: 13.44, p1: 42.99, p2: 33.50
2022/11/07 14:56:00 epoch: 1, lr: 0.0010000, loss: 0.1684, p1: 53.15, p2: 37.69
2022/11/07 19:41:32 epoch: 2, lr: 0.0009600, loss: 0.1010, p1: 46.26, p2: 35.66
2022/11/07 20:19:48 epoch: 1, lr: 0.0010000, loss: 0.0616, p1: 53.44, p2: 38.25
2022/11/08 00:36:14 epoch: 1, lr: 0.0010000, loss: 0.1808, p1: 49.29, p2: 36.80
2022/11/08 00:52:54 epoch: 1, lr: 0.0010000, loss: 0.1788, p1: 48.94, p2: 36.88
2022/11/08 02:08:35 epoch: 3, lr: 0.0009216, loss: 0.0868, p1: 45.43, p2: 34.45
2022/11/08 02:42:28 epoch: 2, lr: 0.0009600, loss: 0.0353, p1: 46.01, p2: 35.90
2022/11/08 03:49:19 epoch: 2, lr: 0.0009600, loss: 0.0962, p1: 44.77, p2: 34.58
2022/11/08 03:56:53 epoch: 2, lr: 0.0009600, loss: 0.0945, p1: 45.17, p2: 34.69
2022/11/08 08:17:49 epoch: 3, lr: 0.0009216, loss: 0.0780, p1: 44.33, p2: 33.83
2022/11/08 08:17:59 epoch: 4, lr: 0.0008847, loss: 0.0786, p1: 43.60, p2: 34.18
2022/11/08 08:18:19 epoch: 3, lr: 0.0009216, loss: 0.0792, p1: 43.32, p2: 33.27
2022/11/08 08:53:23 epoch: 3, lr: 0.0009216, loss: 0.0298, p1: 47.55, p2: 34.86
2022/11/08 11:39:41 epoch: 4, lr: 0.0008847, loss: 0.0688, p1: 44.29, p2: 33.04
2022/11/08 11:48:50 epoch: 4, lr: 0.0008847, loss: 0.0695, p1: 43.09, p2: 33.04
2022/11/08 14:37:18 epoch: 5, lr: 0.0008493, loss: 0.0732, p1: 44.54, p2: 33.55
2022/11/08 15:05:33 epoch: 4, lr: 0.0008847, loss: 0.0268, p1: 43.61, p2: 34.32
2022/11/08 15:31:48 epoch: 5, lr: 0.0008493, loss: 0.0626, p1: 44.13, p2: 33.02
2022/11/08 15:45:34 epoch: 5, lr: 0.0008493, loss: 0.0634, p1: 42.23, p2: 33.02
2022/11/08 21:50:21 epoch: 6, lr: 0.0008154, loss: 0.0583, p1: 42.83, p2: 32.73
2022/11/08 22:35:18 epoch: 6, lr: 0.0008154, loss: 0.0588, p1: 41.42, p2: 32.64
2022/11/08 22:58:06 epoch: 6, lr: 0.0008154, loss: 0.0692, p1: 44.41, p2: 33.73
2022/11/08 23:37:17 epoch: 5, lr: 0.0008493, loss: 0.0248, p1: 43.45, p2: 33.95
2022/11/09 05:45:32 epoch: 7, lr: 0.0007828, loss: 0.0662, p1: 43.10, p2: 33.17
2022/11/09 06:29:04 epoch: 7, lr: 0.0007828, loss: 0.0554, p1: 42.13, p2: 32.87
2022/11/09 06:52:12 epoch: 6, lr: 0.0008154, loss: 0.0233, p1: 43.70, p2: 33.56
2022/11/09 09:54:37 epoch: 1, lr: 0.0010000, loss: 0.0578, p1: 31.00, p2: 23.68
2022/11/09 12:16:40 epoch: 8, lr: 0.0007514, loss: 0.0638, p1: 44.49, p2: 33.49
2022/11/09 13:48:22 epoch: 7, lr: 0.0007828, loss: 0.0222, p1: 43.83, p2: 33.64
2022/11/09 17:04:43 epoch: 8, lr: 0.0007514, loss: 0.0526, p1: 43.04, p2: 32.47
2022/11/09 20:37:41 epoch: 2, lr: 0.0009600, loss: 0.0222, p1: 28.83, p2: 20.26
2022/11/09 21:08:04 epoch: 9, lr: 0.0007214, loss: 0.0618, p1: 43.10, p2: 33.25
2022/11/09 22:16:23 epoch: 8, lr: 0.0007514, loss: 0.0214, p1: 42.64, p2: 33.40
2022/11/10 05:33:46 epoch: 9, lr: 0.0007214, loss: 0.0504, p1: 41.96, p2: 32.21
2022/11/10 06:48:50 epoch: 10, lr: 0.0006925, loss: 0.0601, p1: 43.43, p2: 33.11
2022/11/10 07:12:35 epoch: 9, lr: 0.0007214, loss: 0.0206, p1: 43.42, p2: 33.61
2022/11/10 08:05:48 epoch: 3, lr: 0.0009216, loss: 0.0170, p1: 27.06, p2: 19.46
2022/11/10 16:42:51 epoch: 11, lr: 0.0006648, loss: 0.0586, p1: 43.18, p2: 32.74
2022/11/10 17:12:10 epoch: 10, lr: 0.0006925, loss: 0.0485, p1: 41.93, p2: 32.49
2022/11/10 17:19:03 epoch: 10, lr: 0.0006925, loss: 0.0200, p1: 42.92, p2: 33.39
2022/11/10 19:56:08 epoch: 4, lr: 0.0008847, loss: 0.0144, p1: 25.01, p2: 19.06
2022/11/10 22:50:38 epoch: 12, lr: 0.0006382, loss: 0.0574, p1: 44.83, p2: 33.10
2022/11/11 00:13:18 epoch: 11, lr: 0.0006648, loss: 0.0195, p1: 42.30, p2: 32.96
2022/11/11 03:04:11 epoch: 11, lr: 0.0006648, loss: 0.0468, p1: 41.61, p2: 32.40
2022/11/11 05:41:32 epoch: 13, lr: 0.0006127, loss: 0.0562, p1: 45.02, p2: 33.13
2022/11/11 05:44:55 epoch: 5, lr: 0.0008493, loss: 0.0127, p1: 24.14, p2: 18.16
2022/11/11 06:36:11 epoch: 12, lr: 0.0006382, loss: 0.0190, p1: 42.80, p2: 33.16
2022/11/11 12:18:23 epoch: 13, lr: 0.0006127, loss: 0.0186, p1: 42.82, p2: 33.08
2022/11/11 17:18:10 epoch: 6, lr: 0.0008154, loss: 0.0116, p1: 24.16, p2: 17.81
2022/11/11 20:18:06 epoch: 1, lr: 0.0010000, loss: 0.0539, p1: 32.59, p2: 22.37
2022/11/11 20:50:06 epoch: 14, lr: 0.0005882, loss: 0.0182, p1: 42.85, p2: 32.88
2022/11/11 21:05:49 epoch: 1, lr: 0.0010000, loss: 0.0634, p1: 49.05, p2: 36.30
2022/11/12 01:33:26 epoch: 1, lr: 0.0007828, loss: 0.0920, p1: 44.11, p2: 33.14
2022/11/12 05:55:36 epoch: 15, lr: 0.0005647, loss: 0.0179, p1: 42.64, p2: 33.13
2022/11/12 06:13:33 epoch: 2, lr: 0.0009600, loss: 0.0209, p1: 27.05, p2: 20.32
2022/11/12 06:44:33 epoch: 7, lr: 0.0007828, loss: 0.0107, p1: 23.77, p2: 18.02
2022/11/12 06:58:48 epoch: 2, lr: 0.0009600, loss: 0.0303, p1: 46.49, p2: 34.53
2022/11/12 14:36:11 epoch: 16, lr: 0.0005421, loss: 0.0176, p1: 42.83, p2: 32.91
2022/11/12 14:59:07 epoch: 2, lr: 0.0007515, loss: 0.0639, p1: 42.46, p2: 32.41
2022/11/12 15:53:37 epoch: 3, lr: 0.0009216, loss: 0.0160, p1: 23.66, p2: 18.03
2022/11/12 16:50:31 epoch: 3, lr: 0.0009216, loss: 0.0248, p1: 44.49, p2: 33.73
2022/11/12 20:33:16 epoch: 8, lr: 0.0007514, loss: 0.0100, p1: 22.72, p2: 17.45
2022/11/12 22:49:50 epoch: 17, lr: 0.0005204, loss: 0.0173, p1: 43.09, p2: 32.88
2022/11/13 02:21:32 epoch: 4, lr: 0.0008847, loss: 0.0136, p1: 23.72, p2: 17.83
2022/11/13 03:25:59 epoch: 4, lr: 0.0008847, loss: 0.0216, p1: 43.64, p2: 33.30
2022/11/13 04:35:11 epoch: 3, lr: 0.0007214, loss: 0.0559, p1: 42.51, p2: 31.97
2022/11/13 07:50:07 epoch: 18, lr: 0.0004996, loss: 0.0171, p1: 42.46, p2: 32.98
2022/11/13 10:02:51 epoch: 9, lr: 0.0007214, loss: 0.0094, p1: 23.41, p2: 17.57
2022/11/13 12:09:31 epoch: 5, lr: 0.0008493, loss: 0.0120, p1: 24.48, p2: 17.77
2022/11/13 13:53:39 epoch: 5, lr: 0.0008493, loss: 0.0196, p1: 43.17, p2: 33.09
2022/11/13 16:45:25 epoch: 19, lr: 0.0004796, loss: 0.0168, p1: 42.51, p2: 33.16
2022/11/13 18:31:29 epoch: 4, lr: 0.0006926, loss: 0.0512, p1: 42.20, p2: 32.10
2022/11/13 22:08:05 epoch: 6, lr: 0.0008154, loss: 0.0109, p1: 23.85, p2: 17.48
2022/11/13 23:52:40 epoch: 10, lr: 0.0006925, loss: 0.0089, p1: 24.63, p2: 17.69
2022/11/14 00:01:13 epoch: 6, lr: 0.0008154, loss: 0.0181, p1: 43.03, p2: 32.89
2022/11/14 00:49:19 epoch: 20, lr: 0.0004604, loss: 0.0166, p1: 43.18, p2: 33.25
2022/11/14 08:10:58 epoch: 7, lr: 0.0007828, loss: 0.0101, p1: 22.45, p2: 17.03
2022/11/14 08:28:29 epoch: 5, lr: 0.0006649, loss: 0.0479, p1: 41.94, p2: 32.02
2022/11/14 09:48:05 epoch: 21, lr: 0.0004420, loss: 0.0164, p1: 42.45, p2: 32.96
2022/11/14 10:05:39 epoch: 7, lr: 0.0007828, loss: 0.0170, p1: 42.48, p2: 32.72
2022/11/14 13:46:41 epoch: 11, lr: 0.0006648, loss: 0.0085, p1: 23.93, p2: 17.39
2022/11/14 18:01:40 epoch: 8, lr: 0.0007514, loss: 0.0094, p1: 22.65, p2: 16.78
2022/11/14 18:39:01 epoch: 22, lr: 0.0004243, loss: 0.0162, p1: 42.66, p2: 33.30
2022/11/14 20:19:23 epoch: 8, lr: 0.0007514, loss: 0.0161, p1: 42.81, p2: 32.49
2022/11/14 22:11:29 epoch: 6, lr: 0.0006383, loss: 0.0455, p1: 42.24, p2: 32.07
2022/11/15 03:33:04 epoch: 12, lr: 0.0006382, loss: 0.0081, p1: 23.01, p2: 17.09
2022/11/15 04:04:16 epoch: 23, lr: 0.0004073, loss: 0.0161, p1: 42.12, p2: 33.01
2022/11/15 04:12:45 epoch: 9, lr: 0.0007214, loss: 0.0089, p1: 22.83, p2: 16.66
2022/11/15 04:48:38 epoch: 9, lr: 0.0007214, loss: 0.0154, p1: 42.88, p2: 32.47
2022/11/15 10:59:00 epoch: 7, lr: 0.0006127, loss: 0.0435, p1: 41.21, p2: 32.21
2022/11/15 11:44:38 epoch: 10, lr: 0.0006925, loss: 0.0148, p1: 43.39, p2: 32.63
2022/11/15 13:05:49 epoch: 10, lr: 0.0006925, loss: 0.0085, p1: 22.42, p2: 16.90
2022/11/15 13:53:47 epoch: 13, lr: 0.0006127, loss: 0.0078, p1: 23.02, p2: 17.26
2022/11/15 14:35:33 epoch: 11, lr: 0.0006648, loss: 0.0143, p1: 42.80, p2: 32.48
2022/11/15 18:05:37 epoch: 12, lr: 0.0006382, loss: 0.0138, p1: 43.40, p2: 32.63
2022/11/15 18:48:51 epoch: 8, lr: 0.0005882, loss: 0.0418, p1: 41.74, p2: 31.76
2022/11/15 19:43:58 epoch: 11, lr: 0.0006648, loss: 0.0081, p1: 22.01, p2: 16.39
2022/11/15 20:52:38 epoch: 13, lr: 0.0006127, loss: 0.0134, p1: 42.43, p2: 32.48
2022/11/15 21:49:25 epoch: 14, lr: 0.0005882, loss: 0.0076, p1: 23.14, p2: 17.10
2022/11/15 22:37:55 epoch: 12, lr: 0.0006382, loss: 0.0077, p1: 22.12, p2: 16.67
2022/11/15 23:37:14 epoch: 14, lr: 0.0005882, loss: 0.0130, p1: 42.89, p2: 32.42
2022/11/16 01:42:40 epoch: 13, lr: 0.0006127, loss: 0.0074, p1: 22.19, p2: 16.79
2022/11/16 02:23:48 epoch: 9, lr: 0.0005647, loss: 0.0404, p1: 40.95, p2: 31.87
2022/11/16 02:38:47 epoch: 15, lr: 0.0005647, loss: 0.0127, p1: 42.33, p2: 32.73
2022/11/16 04:32:51 epoch: 14, lr: 0.0005882, loss: 0.0071, p1: 22.48, p2: 16.54
2022/11/16 05:25:45 epoch: 15, lr: 0.0005647, loss: 0.0073, p1: 22.77, p2: 17.02
2022/11/16 05:28:48 epoch: 16, lr: 0.0005421, loss: 0.0124, p1: 43.04, p2: 32.66
2022/11/16 07:22:30 epoch: 15, lr: 0.0005647, loss: 0.0069, p1: 21.93, p2: 16.74
2022/11/16 08:16:42 epoch: 17, lr: 0.0005204, loss: 0.0122, p1: 42.28, p2: 32.51
2022/11/16 10:01:08 epoch: 10, lr: 0.0005421, loss: 0.0393, p1: 41.34, p2: 32.24
2022/11/16 10:26:06 epoch: 16, lr: 0.0005421, loss: 0.0066, p1: 22.14, p2: 16.39
2022/11/16 11:12:56 epoch: 18, lr: 0.0004996, loss: 0.0120, p1: 42.66, p2: 32.40
2022/11/16 13:08:35 epoch: 16, lr: 0.0005421, loss: 0.0070, p1: 23.79, p2: 17.11
2022/11/16 13:28:51 epoch: 17, lr: 0.0005204, loss: 0.0064, p1: 21.92, p2: 16.53
2022/11/16 14:12:16 epoch: 19, lr: 0.0004796, loss: 0.0117, p1: 42.73, p2: 32.64
2022/11/16 16:38:59 epoch: 18, lr: 0.0004996, loss: 0.0063, p1: 21.75, p2: 16.49
2022/11/16 17:20:50 epoch: 20, lr: 0.0004604, loss: 0.0115, p1: 42.33, p2: 32.37
2022/11/16 17:49:35 epoch: 11, lr: 0.0005204, loss: 0.0382, p1: 41.05, p2: 31.99
2022/11/16 19:25:13 epoch: 19, lr: 0.0004796, loss: 0.0061, p1: 21.86, p2: 16.45
2022/11/16 20:03:38 epoch: 21, lr: 0.0004420, loss: 0.0114, p1: 42.61, p2: 32.51
2022/11/16 20:48:04 epoch: 17, lr: 0.0005204, loss: 0.0068, p1: 22.36, p2: 17.06
2022/11/16 22:12:10 epoch: 20, lr: 0.0004604, loss: 0.0059, p1: 21.81, p2: 16.60
2022/11/16 22:46:20 epoch: 22, lr: 0.0004243, loss: 0.0112, p1: 42.32, p2: 32.46
2022/11/17 01:28:55 epoch: 12, lr: 0.0004996, loss: 0.0373, p1: 41.37, p2: 31.93
2022/11/17 01:29:16 epoch: 21, lr: 0.0004420, loss: 0.0058, p1: 21.80, p2: 16.73
2022/11/17 02:00:11 epoch: 23, lr: 0.0004073, loss: 0.0110, p1: 42.91, p2: 32.56
2022/11/17 04:33:53 epoch: 22, lr: 0.0004243, loss: 0.0056, p1: 22.11, p2: 16.52
2022/11/17 04:38:08 epoch: 18, lr: 0.0004996, loss: 0.0066, p1: 22.55, p2: 17.08
2022/11/17 05:01:50 epoch: 24, lr: 0.0003911, loss: 0.0109, p1: 42.61, p2: 32.41
2023/01/26 22:40:56 epoch: 1, lr: 0.0010000, loss: 0.0586, p1: 46.51, p2: 35.69
2023/01/27 00:51:28 epoch: 2, lr: 0.0009600, loss: 0.0248, p1: 47.34, p2: 34.42
2023/01/27 03:28:15 epoch: 3, lr: 0.0009216, loss: 0.0201, p1: 43.35, p2: 33.59
2023/01/27 05:38:49 epoch: 4, lr: 0.0008847, loss: 0.0177, p1: 42.90, p2: 33.25
2023/01/27 07:49:22 epoch: 5, lr: 0.0008493, loss: 0.0161, p1: 44.14, p2: 32.92
2023/01/27 09:59:55 epoch: 6, lr: 0.0008154, loss: 0.0150, p1: 44.21, p2: 32.95
2023/01/27 12:20:53 epoch: 7, lr: 0.0007828, loss: 0.0141, p1: 43.59, p2: 32.83
2023/01/27 14:31:23 epoch: 8, lr: 0.0007514, loss: 0.0134, p1: 42.82, p2: 32.63
2023/01/27 16:41:57 epoch: 9, lr: 0.0007214, loss: 0.0128, p1: 43.18, p2: 32.44
2023/01/27 18:52:30 epoch: 10, lr: 0.0006925, loss: 0.0123, p1: 44.75, p2: 32.65
2023/01/27 21:03:12 epoch: 11, lr: 0.0006648, loss: 0.0119, p1: 43.82, p2: 32.49
2023/01/28 02:20:25 epoch: 1, lr: 0.0010000, loss: 0.0469, p1: 46.08, p2: 34.19
2023/01/28 05:26:43 epoch: 2, lr: 0.0009600, loss: 0.0191, p1: 43.41, p2: 33.82
2023/01/28 08:33:00 epoch: 3, lr: 0.0009216, loss: 0.0158, p1: 43.30, p2: 33.16
2023/01/28 12:04:04 epoch: 4, lr: 0.0008847, loss: 0.0141, p1: 44.36, p2: 32.90
2023/01/28 15:10:17 epoch: 5, lr: 0.0008493, loss: 0.0129, p1: 42.63, p2: 33.17
2023/01/28 18:16:31 epoch: 6, lr: 0.0008154, loss: 0.0121, p1: 43.09, p2: 32.93
2023/01/28 19:37:37 epoch: 1, lr: 0.0010000, loss: 0.0233, p1: 46.17, p2: 34.93
2023/01/28 21:22:45 epoch: 7, lr: 0.0007828, loss: 0.0115, p1: 42.94, p2: 32.57
2023/01/28 22:44:13 epoch: 2, lr: 0.0009600, loss: 0.0095, p1: 44.65, p2: 34.68
2023/01/29 00:28:50 epoch: 8, lr: 0.0007514, loss: 0.0109, p1: 42.96, p2: 32.68
2023/01/29 01:51:02 epoch: 3, lr: 0.0009216, loss: 0.0079, p1: 44.69, p2: 33.93
2023/01/29 03:35:25 epoch: 9, lr: 0.0007214, loss: 0.0105, p1: 43.36, p2: 32.70
2023/01/29 04:58:14 epoch: 4, lr: 0.0008847, loss: 0.0070, p1: 43.92, p2: 33.27
2023/01/29 06:42:09 epoch: 10, lr: 0.0006925, loss: 0.0102, p1: 42.50, p2: 32.79
2023/01/29 08:05:26 epoch: 5, lr: 0.0008493, loss: 0.0065, p1: 44.10, p2: 33.40
2023/01/29 09:48:41 epoch: 11, lr: 0.0006648, loss: 0.0099, p1: 42.94, p2: 32.80
2023/01/29 11:12:28 epoch: 6, lr: 0.0008154, loss: 0.0061, p1: 45.01, p2: 33.34
2023/01/29 12:55:23 epoch: 12, lr: 0.0006382, loss: 0.0096, p1: 42.36, p2: 32.44
2023/01/29 14:20:39 epoch: 7, lr: 0.0007828, loss: 0.0058, p1: 43.60, p2: 33.21
2023/01/29 16:02:37 epoch: 13, lr: 0.0006127, loss: 0.0094, p1: 43.04, p2: 32.50
2023/01/29 17:29:17 epoch: 8, lr: 0.0007514, loss: 0.0055, p1: 43.74, p2: 33.25
2023/01/29 19:11:02 epoch: 14, lr: 0.0005882, loss: 0.0092, p1: 42.98, p2: 32.58
2023/01/29 20:37:39 epoch: 9, lr: 0.0007214, loss: 0.0053, p1: 43.20, p2: 33.01
2023/01/29 23:47:18 epoch: 10, lr: 0.0006925, loss: 0.0051, p1: 43.31, p2: 32.76
2023/01/30 02:59:32 epoch: 11, lr: 0.0006648, loss: 0.0050, p1: 43.34, p2: 33.01
2023/01/30 06:11:14 epoch: 12, lr: 0.0006382, loss: 0.0048, p1: 42.87, p2: 33.04
2023/01/30 09:22:07 epoch: 13, lr: 0.0006127, loss: 0.0047, p1: 43.06, p2: 32.81
2023/01/30 09:56:04 epoch: 1, lr: 0.0010000, loss: 0.0470, p1: 44.28, p2: 33.85
2023/01/30 12:30:40 epoch: 14, lr: 0.0005882, loss: 0.0046, p1: 42.99, p2: 33.00
2023/01/31 18:57:45 epoch: 1, lr: 0.0010000, loss: 0.0176, p1: 41.83, p2: 31.41
2023/01/31 22:48:30 epoch: 2, lr: 0.0009600, loss: 0.0118, p1: 42.07, p2: 31.39
2023/02/01 02:31:05 epoch: 3, lr: 0.0009216, loss: 0.0103, p1: 42.15, p2: 31.18
2023/02/01 06:13:42 epoch: 4, lr: 0.0008847, loss: 0.0095, p1: 40.46, p2: 31.00
2023/02/01 09:55:41 epoch: 5, lr: 0.0008493, loss: 0.0089, p1: 41.88, p2: 31.29
```

gtftyih commented 11 months ago

Thanks. Could I take a look at your opt.txt? It should contain the parameter information (assuming the end of the log corresponds to the last training run).

zhenhuat commented 11 months ago

> Thanks. Could I take a look at your opt.txt? It should contain the parameter information (assuming the end of the log corresponds to the last training run).

This may not be entirely accurate; the key parameters are batch, channel, stride, and t_downsample.

```
==> Args:
MAE: False
MAE_reload: 0
actions: *
batchSize: 128
channel: 256
checkpoint: checkpoint/model_243_STMO
crop_uv: 0
d_hid: 256
data_augmentation: True
dataset: h36m
downsample: 1
frames: 243
gpu: 0
in_channels: 2
keypoints: cpn_ft_h36m_dbb
large_decay_epoch: 80
layers: 6
lr: 0.001
lr_decay: 0.96
lr_decay_large: 0.5
lr_refine: 1e-05
manualSeed: 1
n_joints: 17
nepoch: 80
out_all: 1
out_channels: 3
out_joints: 17
pad: 121
previous_best_threshold: inf
previous_dir: ./checkpoint/model_243_STMO/no_refine_6_4142.pth
previous_name:
previous_refine_name:
refine: False
refine_reload: 0
reload: 1
resume: False
reverse_augmentation: False
root_path: ./dataset/
spatial_mask_num: 0
stride: 9
stride_num: [3, 3, 3, 3, 3]
subjects_test: S9,S11
subjects_train: S1,S5,S6,S7,S8
subset: 1
t_downsample: 3
temporal_mask_rate: 0
test: 1
test_augmentation: True
train: 1
workers: 4
```

gtftyih commented 11 months ago

OK, many thanks 👌