PengyiZhang / SlimYOLOv3

This page is for SlimYOLOv3: Narrower, Faster and Better for Real-Time UAV Applications

Regarding mAP values before and after pruning #38

[Open] varghesealex90 opened this issue 5 years ago

varghesealex90 commented 5 years ago

Running prune.py produces a cfg file and a PyTorch model. I find that the mAP of the resulting model is 0% before fine-tuning. Is this normal?

wjjouc commented 5 years ago

@varghesealex90 I set the path to test.jpg in line 324 of prune.py as follows:

```python
pruned_model.eval()
img_path = "/test.jpg"  # line 324
org_img = cv2.imread(img_path)  # BGR
img, ratiow, ratioh, padw, padh = letterbox(org_img, new_shape=[img_size, img_size], mode='rect')
```

and then run `python prune.py`, but a new issue occurs as follows:

```
Traceback (most recent call last):
  File "prune.py", line 391, in <module>
    opt.perlayer_ratio,
  File "prune.py", line 328, in test
    img, ratiow, ratioh, padw, padh = letterbox(org_img, new_shape=[img_size,img_size], mode='rect')
ValueError: not enough values to unpack (expected 5, got 4)
```
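A minimal workaround I am considering, assuming the four returned values are `img, ratio, padw, padh` (I have not verified this against the `letterbox` definition in utils/datasets.py):

```python
# prune.py, around line 328 -- adapt the unpacking to a 4-value letterbox.
# Assumption: this letterbox returns one combined ratio instead of (ratiow, ratioh).
img, ratio, padw, padh = letterbox(org_img, new_shape=[img_size, img_size], mode='rect')
ratiow = ratioh = ratio  # keep the rest of test() unchanged
```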

In spite of this, I still get a folder containing prune.cfg and prune.pt. I then use the generated prune.pt and prune.cfg to test directly in ultralytics/yolov3 as follows:

```
python test.py --cfg <-path->/prune.cfg --data-cfg <-path-to-data-> --weights <-path->/prune.pt
```

An issue occurs as follows:

```
Traceback (most recent call last):
  File "test.py", line 209, in <module>
    opt.save_json
  File "test.py", line 31, in test
    model.load_state_dict(torch.load(weights, map_location=device)['model'])
KeyError: 'model'
```

I think there is some error with the generated prune.pt. I have tried converting prune.pt to prune.weights, but that also went wrong. What should I do? And how should I fix the issue that occurs when I run `python prune.py`?

varghesealex90 commented 5 years ago

This one is easy to fix. Change

```python
model.load_state_dict(torch.load(weights, map_location=device)['model'])
```

to

```python
model.load_state_dict(torch.load(weights, map_location=device))
```

nyj-ocean commented 5 years ago

I have met the same problem as @wjjouc when running `python prune.py`:

```
img, ratiow, ratioh, padw, padh = letterbox(org_img, new_shape=[img_size,img_size], mode='rect')
ValueError: not enough values to unpack (expected 5, got 4)
```

And after modifying prune.py as follows:

```python
# changed from:
model.load_state_dict(torch.load(weights, map_location=device)['model'])
# to:
model.load_state_dict(torch.load(weights, map_location=device))
```

a new problem occurs as follows:

```
size mismatch for module_list.94.conv_94.weight: copying a param with shape torch.Size([256, 755, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 768, 1, 1]).
size mismatch for module_list.112.conv_112.weight: copying a param with shape torch.Size([128, 373, 1, 1]) from checkpoint, the shape in current model is torch.Size([128, 384, 1, 1]).
```
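One way to see exactly which layers disagree is to diff the checkpoint against the model that the cfg builds. A rough sketch, assuming the `Darknet` class from ultralytics/yolov3 (file names are placeholders):

```python
import torch
from models import Darknet  # ultralytics/yolov3

model = Darknet("prune.cfg")  # network described by the pruned cfg
ckpt = torch.load("prune.pt", map_location="cpu")
state = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt

# Print every parameter whose shape differs between checkpoint and model.
for name, param in model.state_dict().items():
    if name in state and tuple(state[name].shape) != tuple(param.shape):
        print(name, tuple(state[name].shape), "vs", tuple(param.shape))
```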

@varghesealex90

1. Have you ever met any issues when running `python prune.py`?

2. I used COCO data for sparsity training and it went well, but I hit this problem during channel pruning. Is the format of your own data similar to the COCO format?

3. Do you know the format of the VisDrone data that the author provides? I notice that one of the txt files in the annotations folder, such as 0000002_00005_d_0000014.txt, looks like this (see the conversion sketch at the end of this comment):

```
684,8,273,116,0,0,0,0
406,119,265,70,0,0,0,0
255,22,119,128,0,0,0,0
```

4. Could I ignore the following issue that occurs when running `python prune.py`?

File "prune.py", line 328, in test img, ratiow, ratioh, padw, padh = letterbox(org_img, new_shape=[img_size,img_size], mode='rect') ValueError: not enough values to unpack (expected 5, got 4)

But when I use the generated prune.pt and prune.cfg to test directly in ultralytics/yolov3, a new problem occurs again. It seems that prune.pt is wrong and cannot be used for testing.
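Edit, on question 3: the VisDrone toolkit documents each annotation line as `<bbox_left>,<bbox_top>,<bbox_width>,<bbox_height>,<score>,<object_category>,<truncation>,<occlusion>`, so `684,8,273,116,0,0,0,0` would be a 273x116 box with its top-left corner at (684, 8). If that holds, a conversion to a YOLO-style label could look like this sketch (the class re-indexing and the skipping of category 0, "ignored regions", are assumptions to verify against the toolkit):

```python
# Minimal sketch: convert one VisDrone annotation line to a YOLO label string.
# img_w/img_h are the dimensions of the corresponding image.
def visdrone_to_yolo(line, img_w, img_h):
    left, top, w, h, score, cat, trunc, occ = map(int, line.strip().split(',')[:8])
    if cat == 0:  # category 0 marks "ignored regions" in VisDrone (assumption)
        return None
    cx = (left + w / 2) / img_w   # normalized box center x
    cy = (top + h / 2) / img_h    # normalized box center y
    return f"{cat - 1} {cx:.6f} {cy:.6f} {w / img_w:.6f} {h / img_h:.6f}"
```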

nyj-ocean commented 5 years ago

@varghesealex90 I tried downloading the original yolov3.weights from https://pjreddie.com/media/files/yolov3.weights and then ran prune.py with the original yolov3.weights and yolov3.cfg as follows:

```
python prune.py --cfg yolov3.cfg --weights yolov3.weights
```

Next, I ran test.py with the generated prune.pt and prune.cfg in ultralytics/yolov3 as follows:

```
python test.py --cfg prune.cfg --weights prune.pt
```

But the resulting mAP is 0%.

varghesealex90 commented 5 years ago

@nyj-ocean I faced the same issue. I too got mAP = 0% after pruning. However, I fine-tuned the model for about 30-40k iterations with the pruned weights and cfg, and I was able to restore the mAP to a value equivalent to that of the unpruned network.
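For reference, the fine-tuning itself was just an ordinary training run starting from the pruned files, something along these lines (flag names assumed to match the test.py invocation above; check your version of ultralytics/yolov3):

```
python train.py --cfg <-path->/prune.cfg --data-cfg <-path-to-data-> --weights <-path->/prune.pt
```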

varghesealex90 commented 5 years ago

Regarding the model loading, wrap it in a try/except:

```python
try:
    model.load_state_dict(torch.load(weights, map_location=device)['model'])
except KeyError:
    model.load_state_dict(torch.load(weights, map_location=device))
```
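The first branch handles checkpoints saved as a dict with a 'model' key (the format test.py expects), while the fallback handles a bare state_dict, which appears to be what prune.py writes.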

broliao commented 5 years ago

@wjjouc @varghesealex90 When I run prune.py with the following parameters (screenshot attached), I get this error. What can I do? Could you help me? Thank you very much.

```
Traceback (most recent call last):
  File "D:/LJXpycharmproject/yolov3-slimyolov3/prune.py", line 398, in <module>
    opt.perlayer_ratio,
  File "D:/LJXpycharmproject/yolov3-slimyolov3/prune.py", line 346, in test
    inf_out, train_out = pruned_model(imgs)  # inference and training outputs
  File "D:\anaconda\lib\site-packages\torch\nn\modules\module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "D:\LJXpycharmproject\yolov3-slimyolov3\models.py", line 226, in forward
    x = torch.cat([layer_outputs[i] for i in layers], 1)
RuntimeError: invalid argument 0: Sizes of tensors must match except in dimension 1. Got 20 and 40 in dimension 2 at C:/w/1/s/tmp_conda_3.6_041836/conda/conda-bld/pytorch_1556684464974/work/aten/src\THC/generic/THCTensorMath.cu:71
```

varghesealex90 commented 5 years ago

@broliao I never encountered this error. It looks like a shape mismatch.
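If you want to localize it, one option is to print the shape of each routed tensor just before the failing concatenation in models.py; unequal heights/widths at a route layer usually mean the cfg and the weights describe different networks, or the input size is not what the cfg expects. A sketch, with variable names taken from the models.py excerpt in your traceback:

```python
# models.py, inside forward(), just before the failing line 226:
for i in layers:
    print("route input", i, tuple(layer_outputs[i].shape))  # (N, C, H, W)
x = torch.cat([layer_outputs[i] for i in layers], 1)
```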

cartovarc commented 5 years ago

@varghesealex90 Can you explain how you performed the sparsity training step? I can't see the train_drone.py file. Thank you!