ZJU-lishuang / yolov5_prune

yolov5 prune: supports the v2, v3, v4, and v6 versions of yolov5
Apache License 2.0

After fine-tuning, how do I test the model to get per-class P, R, and mAP values? #58

Closed zxsitu closed 2 years ago

zxsitu commented 2 years ago

Hello, I went through training, sparsity training, and pruning (using prune0) as normal, and finally fine-tuned to recover accuracy. I then wanted to test the fine-tuned model's performance and used test.py from your yolov5-v4 repo, but it failed with the error below.

PS F:\PycharmProjects\pythonProject4\yolov5-v4> python test.py --weights test-models/prune0.7/weights/prune_0.7_sewer-yolov5s-sparsity0-last_finetune-best.pt --data data/coco128.yaml --batch-size 1 --img-size 640 --iou-thres 0.5 --task val --verbose --name prune_0.7_sewer-yolov5s-sparsity0-last_finetune-best-val
Namespace(augment=False, batch_size=1, conf_thres=0.001, data='data/coco128.yaml', device='', exist_ok=False, img_size=640, iou_thres=0.5, name='prune_0.7_sewer-yolov5s-sparsity0-last_finetune-best-val', project='runs/test', save_conf=False, save_hybrid=False, save_json=False, save_txt=False, single_cls=False, task='val', verbose=True, weights=['test-models/prune0.7/weights/prune_0.7_sewer-yolov5s-sparsity0-last_finetune-best.pt'])
Using torch 1.7.1 CUDA:0 (Quadro P5000, 16384.0MB)

Traceback (most recent call last):
  File "test.py", line 566, in <module>
    test(opt.data,
  File "test.py", line 315, in test
    imgsz = check_img_size(imgsz, s=model.stride.max())  # check img_size
  File "D:\Anaconda3\envs\yolo_env\lib\site-packages\torch\nn\modules\module.py", line 778, in __getattr__
    raise ModuleAttributeError("'{}' object has no attribute '{}'".format(
torch.nn.modules.module.ModuleAttributeError: 'Darknet' object has no attribute 'stride'
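
(For context: the yolov5-v4 test.py expects an ultralytics-style model that exposes a `stride` attribute, while the pruned/fine-tuned checkpoint holds a Darknet-style model that has none. A minimal workaround sketch, assuming the usual YOLOv5s maximum stride of 32; the `hasattr` fallback is an assumption, not part of the original script.)

```python
# Hypothetical patch around test.py line 315: fall back to stride 32 when the
# loaded model (here a Darknet instance) does not expose a .stride attribute.
stride = int(model.stride.max()) if hasattr(model, 'stride') else 32
imgsz = check_img_size(imgsz, s=stride)  # check img_size against the model stride
```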

I then noticed that your yolov5_prune repo also has a test.py, so I tried it as well to see whether I could get metrics for the fine-tuned model, but it also fails (and it has no --task val/test or --verbose options).

PS F:\PycharmProjects\pythonProject4\yolov5-v4-prune> python test.py --cfg test-models/prune0.7/cfg/prune_0.7_yolov5s_v4.cfg --weights test-models/prune0.7/weights/prune_0.7_sewer-yolov5s-sparsity0-last_finetune-best.pt --data data/coco_128img.data --batch-size 1 --img-size 640 --iou-thres 0.5
Namespace(batch_size=1, cfg='test-models/prune0.7/cfg/prune_0.7_yolov5s_v4.cfg', conf_thres=0.001, data='data/coco_128img.data', device='', img_size=640, iou_thres=0.5, nms_thres=0.5, save_json=False, weights='test-models/prune0.7/weights/prune_0.7_sewer-yolov5s-sparsity0-last_finetune-best.pt')
Using CUDA device0 _CudaDeviceProperties(name='Quadro P5000', total_memory=16384MB)

Traceback (most recent call last):
  File "test.py", line 324, in <module>
    test(opt.cfg,
  File "test.py", line 140, in test
    model.load_state_dict(torch.load(weights, map_location=device)['model'])
  File "D:\Anaconda3\envs\yolo_env\lib\site-packages\torch\nn\modules\module.py", line 1025, in load_state_dict
    state_dict = state_dict.copy()
  File "D:\Anaconda3\envs\yolo_env\lib\site-packages\torch\nn\modules\module.py", line 778, in __getattr__
    raise ModuleAttributeError("'{}' object has no attribute '{}'".format(
torch.nn.modules.module.ModuleAttributeError: 'Darknet' object has no attribute 'copy'
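
(For context: this second failure happens because the fine-tuned checkpoint stores the whole Darknet module under the 'model' key rather than a plain state_dict, so `load_state_dict()` crashes when it tries to call `.copy()` on it. A minimal loading sketch that accepts either layout; treating it as a drop-in fix for this repo's test.py is an assumption.)

```python
import torch

# Accept either checkpoint layout: a plain state_dict or a whole saved module.
ckpt = torch.load(weights, map_location=device)['model']
state_dict = ckpt if isinstance(ckpt, dict) else ckpt.float().state_dict()
model.load_state_dict(state_dict)
```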

Do you have a script that can evaluate the pruned and fine-tuned models? Thanks.

ZJU-lishuang commented 2 years ago

There is no ready-made function that reports per-class P, R, and mAP. I suggest saving the detection results and evaluating the model's performance separately.
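
(One possible route for the "save detections and evaluate separately" suggestion: export the detections to COCO-format JSON and let pycocotools compute per-class AP. The file names and the COCO-style ground-truth annotations below are assumptions, not files produced by this repo; this is a sketch, not code from the project.)

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO('annotations/instances_val.json')  # ground truth (path is an assumption)
coco_dt = coco_gt.loadRes('detections.json')      # saved detection results (assumed export)

coco_eval = COCOeval(coco_gt, coco_dt, iouType='bbox')
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()                             # overall AP/AR

# Per-class AP@0.5: precision has shape [T, R, K, A, M];
# T=0 is IoU 0.5, A=0 is the 'all' area range, M=-1 is the largest maxDets.
for k, cat_id in enumerate(coco_eval.params.catIds):
    name = coco_gt.loadCats(cat_id)[0]['name']
    p = coco_eval.eval['precision'][0, :, k, 0, -1]
    ap = p[p > -1].mean() if (p > -1).any() else float('nan')
    print(f'{name}: AP@0.5 = {ap:.4f}')
```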

zxsitu commented 2 years ago

@ZJU-lishuang Understood. One more small question: how do I get the inference time of the fine-tuned model? Is it simply equal to the inference time of the pruned model?

ZJU-lishuang commented 2 years ago

Fine-tuning does not change the model's inference time.
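
(If an actual number is needed, a rough benchmarking sketch is below; `measure_inference_time` is a hypothetical helper, not part of the repo, and the model is assumed to be already built and loaded. Since fine-tuning only changes weight values, not the architecture, timing either the pruned or the fine-tuned checkpoint of the same cfg gives the same result.)

```python
import time
import torch

def measure_inference_time(model, img_size=640, runs=100, warmup=10, device='cuda'):
    """Return the average forward-pass time in milliseconds on a dummy batch."""
    model = model.to(device).eval()
    img = torch.zeros(1, 3, img_size, img_size, device=device)
    with torch.no_grad():
        for _ in range(warmup):          # warm-up so CUDA kernels are cached
            model(img)
        if device.startswith('cuda'):
            torch.cuda.synchronize()
        t0 = time.time()
        for _ in range(runs):
            model(img)
        if device.startswith('cuda'):
            torch.cuda.synchronize()
    return (time.time() - t0) / runs * 1000.0
```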

zxsitu commented 2 years ago

Thanks, that's clear.

ZJU-lishuang commented 2 years ago

@ilem777 Did everything before this run without problems for you?

ZJU-lishuang commented 2 years ago

@ilem777 Did you not run into the problems reported in the other issues?

zxsitu commented 2 years ago

@ZJU-lishuang I ran your code over the National Day holiday without problems, including normal training, sparsity training, pruning, and fine-tuning, but the accuracy was a bit low. So I have revised my dataset and plan to go through the whole pipeline again next week. The problem I am hitting now is that pruning fails with ValueError: only one element tensors can be converted to Python scalars. In another issue you explained that this means the sparsification is insufficient, so I am now increasing sparsity training from 300 to 500 epochs and trying again.

python train_sparsity.py --weights runs/train/sewer-yolov5s/weights/last.pt --data data/coco128.yaml --cfg models/yolov5s.yaml --hyp data/hyp.scratch.yaml --img-size 640 --batch 8 --epochs 500 -sr --s 0.001 --prune 1 --name sewer-yolov5s-sparsity1

ZJU-lishuang commented 2 years ago

Right. That error means the minimum of the model parameters is not unique, i.e. several values are tied at the lowest point, which indicates the sparsification did not go far enough. Possible causes are a dataset that is too small (in an experiment using only a few dozen training images, the model never sparsified) or too little sparsity training (pruning after only a dozen or so epochs also triggers this error).
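
(One way to sanity-check sparsification before pruning is to look at the BatchNorm scale factors directly; `bn_gamma_summary` below is a hypothetical helper and assumes `model` is the loaded sparsity-trained network. If the low end of the gamma distribution is not clearly pinned near zero, more sparsity epochs or a larger --s value are likely needed.)

```python
import torch
import torch.nn as nn

def bn_gamma_summary(model, percent=0.7):
    """Summarize the absolute BN gammas and the pruning threshold at `percent`."""
    gammas = torch.cat([m.weight.data.abs().flatten()
                        for m in model.modules() if isinstance(m, nn.BatchNorm2d)])
    sorted_g, _ = torch.sort(gammas)
    thresh = sorted_g[int(len(sorted_g) * percent)]   # threshold for this prune ratio
    print(f'BN gammas: min={sorted_g[0].item():.6f}, '
          f'median={sorted_g[len(sorted_g) // 2].item():.6f}, '
          f'threshold@{percent:.0%}={thresh.item():.6f}')
    print(f'fraction of gammas below 0.01: {(gammas < 0.01).float().mean().item():.2%}')
```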

zxsitu commented 2 years ago

Got it, understood. Thank you for the explanation.