seekFire opened this issue 3 years ago
I think maybe the image names do not follow the expected pattern (`P00000.500`, `P00000.501000`, `P00000.50___1500`), but you may debug to see the details.
Maybe that is not the reason; the val image names after cropping with ImgSplit_multi_process.py are as follows:
P0003__0.5__0___0
P0003__1__0___0
P0003__1__0___423
P0003__1__500___0
P0003__1__500___423
P0003__1__547___0
P0003__1__547___423
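As a side note, patch names of this form encode the image id, rescale factor, and crop offsets, and can be parsed back apart. A small sketch (my own helper, not part of the repo):

```python
import re

# Patch names produced by ImgSplit_multi_process.py look like
# <image_id>__<scale>__<left>___<up>, e.g. "P0003__0.5__0___0".
NAME_PATTERN = re.compile(r"^(?P<img>\w+)__(?P<scale>[\d.]+)__(?P<left>\d+)___(?P<up>\d+)$")

def parse_patch_name(name):
    """Split a patch name into (image_id, scale, left, up)."""
    m = NAME_PATTERN.match(name)
    if m is None:
        raise ValueError(f"unexpected patch name: {name}")
    return m["img"], float(m["scale"]), int(m["left"]), int(m["up"])

print(parse_patch_name("P0003__1__547___423"))  # ('P0003', 1.0, 547, 423)
```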
The configuration in ImgSplit_multi_process.py is as follows:
split = splitbase(r'/media/val',
                  r'/media/Dota4BBAVectors/val',
                  gap=100,
                  subsize=600,
                  num_process=8)
split.splitdata(1)
split.splitdata(0.5)
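With subsize=600 and gap=100, the window stride is 600 - 100 = 500, which matches the left offsets 0 and 500 above; the last window is shifted back so it ends at the image border, which would give 547 for a 1147-px-wide image (and 423 vertically for a 1023-px-high one). A minimal sketch of that windowing logic, re-derived from the observed offsets rather than copied from the repo:

```python
def window_offsets(length, subsize=600, gap=100):
    """Start offsets of sliding windows along one image axis.

    Stride is subsize - gap; the final window is shifted back so it
    ends exactly at the image border, as ImgSplit-style tools do.
    """
    stride = subsize - gap
    if length <= subsize:
        return [0]
    offsets = []
    pos = 0
    while True:
        if pos + subsize >= length:
            offsets.append(length - subsize)  # clamp last window to border
            break
        offsets.append(pos)
        pos += stride
    return offsets

print(window_offsets(1147))  # [0, 500, 547]
print(window_offsets(1023))  # [0, 423]
```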
Thus the 458 original val images were split into 17651 patches. I also found that when I run the command mentioned above repeatedly, the set of Task1_xxx.txt files that end up empty differs from run to run...
@yijingru I think I've found the reason. I had uncommented the 'BatchNorm2d' line in ctxbox_net.py, so the model became unstable. Now everything goes well. Thank you!
I remember that uncommenting BatchNorm2d in ctxbox_net.py
helped stabilize training in #20, but in this issue uncommenting 'BatchNorm2d' in ctxbox_net.py made the model unstable. So, in the end, should BN be used or not?
@Lg955 Well, that's because I used the pre-trained model, which does not contain the BatchNorm2d layer. If you want to use BatchNorm2d, you may need to train a new model from scratch.
Oh, I see, thank you!
@Lg955
The 1st question: Yes, you need to run ImgSplit.py
on the three datasets separately, then merge the images & labels from the train set & val set into the trainval directory.
The 2nd question: Yes, the data in the trainval directory is only used for training.
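A minimal sketch of that merge step, assuming a hypothetical layout where train/ and val/ each contain images/ and labelTxt/ subfolders after splitting (the paths and folder names are illustrative, not required by the repo):

```python
import shutil
from pathlib import Path

def merge_trainval(root):
    """Copy images and labels of the train and val splits into trainval/.

    Assumes root contains train/ and val/, each with images/ and
    labelTxt/ subfolders (hypothetical layout for illustration).
    """
    root = Path(root)
    for split in ("train", "val"):
        for sub in ("images", "labelTxt"):
            dst = root / "trainval" / sub
            dst.mkdir(parents=True, exist_ok=True)
            for f in (root / split / sub).glob("*"):
                shutil.copy(f, dst / f.name)
```

Running `merge_trainval("/media/Dota4BBAVectors")` (path illustrative) would then leave both splits' patches and annotations together under trainval/.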
After running the following command, I got 15 Task1_xxx.txt files in result_dota and merge_dota respectively, but some of them have no content...
python3 main.py --data_dir /media/Dota4BBAVectors/val --resume model_50.pth --conf_thresh 0.1 --dataset dota --phase eval
I use the val set of DOTA, which has been split into 17651 patches (600×600), as the input data. What do you think is the reason for this phenomenon?