Closed · NjustWrunning closed this issue 2 years ago
With batch_size = 4 and lr = 0.01, after 150 epochs the loss dropped from 1 to 0.5, but training was very slow, so I increased the batch size: with batch_size = 64 and lr = 0.16, after 300 epochs the loss dropped from 1 to 0.4.
Then, starting from the current weights, I disabled multi-sample augmentations such as mosaic and mixup and continued training; the loss kept converging.
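For reference, the jump from lr = 0.01 at batch_size = 4 to lr = 0.16 at batch_size = 64 is consistent with the linear scaling rule (scale the learning rate by the same factor as the batch size, here 16x). A minimal sketch of that rule, using the numbers from this issue (the helper name is my own, not from any repo):

```python
# Linear learning-rate scaling: when the batch size grows k-fold,
# scale the learning rate k-fold as well.
BASE_BATCH = 4
BASE_LR = 0.01

def scaled_lr(new_batch: int, base_batch: int = BASE_BATCH,
              base_lr: float = BASE_LR) -> float:
    """Return the learning rate scaled linearly with batch size."""
    return base_lr * new_batch / base_batch

print(scaled_lr(64))  # batch 4 -> 64 is 16x, so lr 0.01 -> 0.16
```

Note that this rule is a heuristic; large-batch runs often also need warmup, and a too-large scaled lr can stall convergence.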