varghesealex90 opened this issue 5 years ago
@varghesealex90 I set the path to test.jpg at line 324 of prune.py as follows:

    pruned_model.eval()
    img_path = "/test.jpg"  # line 324
    org_img = cv2.imread(img_path)  # BGR
    img, ratiow, ratioh, padw, padh = letterbox(org_img, new_shape=[img_size, img_size], mode='rect')

and then ran python prune.py, but a new issue occurred:
    Traceback (most recent call last):
      File "prune.py", line 391, in <module>
        opt.perlayer_ratio,
      File "prune.py", line 328, in test
        img, ratiow, ratioh, padw, padh = letterbox(org_img, new_shape=[img_size, img_size], mode='rect')
    ValueError: not enough values to unpack (expected 5, got 4)
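One workaround for this, inside test() in prune.py where org_img and img_size are already defined, is to unpack defensively. This is a sketch, on the assumption that the four-value variant of letterbox returns (img, ratio, padw, padh) with a single scale factor; check the letterbox in your copy of utils/datasets.py:

    # Hedged workaround for prune.py line 328: handle both the
    # five-value and the (assumed) four-value letterbox variants.
    ret = letterbox(org_img, new_shape=[img_size, img_size], mode='rect')
    if len(ret) == 5:
        img, ratiow, ratioh, padw, padh = ret
    else:
        # assumed four-value variant: (img, ratio, padw, padh)
        img, ratio, padw, padh = ret
        ratiow = ratioh = ratio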
In spite of this, I still get a folder with prune.cfg and prune.pt. I then used the generated prune.pt and prune.cfg to test directly in ultralytics/yolov3:

    python test.py --cfg <-path->/prune.cfg --data-cfg <-path-to-data-> --weights <-path->/prune.pt
and an issue occurred:

    Traceback (most recent call last):
      File "test.py", line 209, in <module>
        opt.save_json
      File "test.py", line 31, in test
        model.load_state_dict(torch.load(weights, map_location=device)['model'])
    KeyError: 'model'
I think there is some error in the generated prune.pt. I also tried converting pruned.pt to pruned.weights, but that went wrong as well. What should I do, and how should I fix the error that occurs when I run python prune.py?
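(On the pruned.pt to pruned.weights conversion: older revisions of ultralytics/yolov3 expose a save_weights helper in models.py that writes darknet-format weights. A minimal sketch, assuming that helper and the Darknet class are present in your checkout:)

    # Hedged sketch: convert the pruned PyTorch checkpoint back to
    # darknet .weights. Assumes models.py provides Darknet and
    # save_weights, as older ultralytics/yolov3 revisions do.
    import torch
    from models import Darknet, save_weights

    model = Darknet('prune.cfg')  # build from the *pruned* cfg
    model.load_state_dict(torch.load('prune.pt', map_location='cpu'))
    save_weights(model, path='prune.weights', cutoff=-1)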
This seems easy. Change

    model.load_state_dict(torch.load(weights, map_location=device)['model'])

to

    model.load_state_dict(torch.load(weights, map_location=device))
I have met the same problem as @wjjouc when running python prune.py:

    img, ratiow, ratioh, padw, padh = letterbox(org_img, new_shape=[img_size, img_size], mode='rect')
    ValueError: not enough values to unpack (expected 5, got 4)
After modifying test.py as suggested, changing

    model.load_state_dict(torch.load(weights, map_location=device)['model'])

to

    model.load_state_dict(torch.load(weights, map_location=device))
a new problem occurs:

    size mismatch for module_list.94.conv_94.weight: copying a param with shape torch.Size([256, 755, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 768, 1, 1]).
    size mismatch for module_list.112.conv_112.weight: copying a param with shape torch.Size([128, 373, 1, 1]) from checkpoint, the shape in current model is torch.Size([128, 384, 1, 1]).
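Size mismatches like these (755 vs. 768 input channels at the route layers) are what load_state_dict reports when the network is built from the unpruned yolov3.cfg but loaded with pruned weights: the checkpoint carries the pruned channel counts. The shapes can only line up if the model is constructed from the cfg that prune.py generated. A sketch, assuming ultralytics/yolov3's Darknet class:

    # Sketch: build the network from the pruned cfg so its shapes
    # match the pruned checkpoint, then load the weights.
    import torch
    from models import Darknet

    device = 'cpu'
    model = Darknet('prune.cfg', img_size=416).to(device)  # pruned topology
    model.load_state_dict(torch.load('prune.pt', map_location=device))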
@varghesealex90
1. Have you ever met any issues when running python prune.py?
2. I use COCO data for the sparsity training and it goes well, but I meet problems when doing the channel pruning. Is the format of your own data similar to the COCO format?
3. Do you know the format of the VisDrone data that the author provides? One of the txt files in the annotations folder, e.g. 0000002_00005_d_0000014.txt, looks like this (see the parsing sketch after this list):

    684,8,273,116,0,0,0,0
    406,119,265,70,0,0,0,0
    255,22,119,128,0,0,0,0

4. Could I ignore the following error that occurs when running python prune.py?

    File "prune.py", line 328, in test
        img, ratiow, ratioh, padw, padh = letterbox(org_img, new_shape=[img_size, img_size], mode='rect')
    ValueError: not enough values to unpack (expected 5, got 4)

When I use the generated prune.pt and prune.cfg to test directly in ultralytics/yolov3, a new problem occurs again. It seems prune.pt is wrong and cannot be used for testing.
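On question 3: those eight comma-separated columns match the published VisDrone annotation layout, i.e. bbox_left, bbox_top, bbox_width, bbox_height, score, object_category, truncation, occlusion (treat that mapping as an assumption to verify against the VisDrone toolkit). A self-contained sketch that converts one such file to YOLO-style labels, with a hypothetical image size:

    # Hedged sketch: convert a VisDrone annotation file to YOLO labels
    # (class x_center y_center width height, all normalized).
    # Column meanings assume the published VisDrone layout:
    # bbox_left, bbox_top, bbox_width, bbox_height, score,
    # object_category, truncation, occlusion
    def visdrone_to_yolo(ann_path, img_w, img_h):
        lines = []
        with open(ann_path) as f:
            for row in f:
                vals = row.strip().split(',')
                if len(vals) < 8:
                    continue
                left, top, w, h = map(float, vals[:4])
                category = int(vals[5])
                xc = (left + w / 2) / img_w
                yc = (top + h / 2) / img_h
                lines.append(f"{category} {xc:.6f} {yc:.6f} {w / img_w:.6f} {h / img_h:.6f}")
        return lines

    # hypothetical image size; read it from the actual image in practice
    print('\n'.join(visdrone_to_yolo('0000002_00005_d_0000014.txt', 1920, 1080)))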
@varghesealex90 I tried downloading the original yolov3.weights from https://pjreddie.com/media/files/yolov3.weights and then ran prune.py with the original yolov3.weights and yolov3.cfg:

    python prune.py --cfg yolov3.cfg --weights yolov3.weights
Next I ran test.py in ultralytics/yolov3 with the generated prune.pt and prune.cfg:

    python test.py --cfg prune.cfg --weights prune.pt
But the final mAP is 0%.
@nyj-ocean I faced the same issue: I also got mAP = 0% after pruning. However, after fine-tuning the model for about 30-40k iterations with the pruned weights and cfg, I was able to restore the mAP to a value equivalent to the unpruned network.
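(For reference, if your train.py accepts the same style of flags as test.py does here, the fine-tuning step would look roughly like the line below; the flag names vary across ultralytics/yolov3 revisions, so treat them as an assumption:)

    python train.py --cfg <-path->/prune.cfg --data-cfg <-path-to-data-> --weights <-path->/prune.pt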
Regarding the model loading, wrap the code in a try/except so both checkpoint formats work:

    try:
        # ultralytics checkpoints store the state_dict under a 'model' key
        model.load_state_dict(torch.load(weights, map_location=device)['model'])
    except KeyError:
        # prune.py saves a bare state_dict with no 'model' key
        model.load_state_dict(torch.load(weights, map_location=device))
@wjjouc @varghesealex90 When I run prune.py with the following parameters, I get this error. What can I do? Could you help me? Thank you very much.

    Traceback (most recent call last):
      File "D:/LJXpycharmproject/yolov3-slimyolov3/prune.py", line 398, in <module>
@broliao I never encountered this error. It seems like a shape mismatch.
@varghesealex90 Can you explain how you did the sparsity-training step? I can't see a train_drone.py file. Thank you!
Running prune.py produces a cfg file and a PyTorch model. I find that the mAP of the resulting model is 0% before fine-tuning. Is this normal?