Closed johnran103 closed 1 year ago
Hello, please provide me with the mmdetection training log. By the way, the checkpoint has been updated.
20230329_125746.log Hello, this is my training log. Thank you!
By the way, these two results come from your checkpoint: the first is tested with the default COCO scripts, the second with the ignored regions taken into account.
It seems the training phase works fine, and the post-processing also works on my machine, so you could check that your dataset format follows UFPMP-Det, check the official val script, or check the path in https://github.com/Cuogeihong/CEASC/blob/6546f7bb945b6e4579b5b54574b2fb4f417be2e5/tools/json_to_txt.py#L14
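In case it helps, here is a minimal sanity-check sketch for the dataset-format step (not part of the repo); the annotation path is an assumption, and the only hard expectation is the 10 VisDrone object categories.

```python
# Quick sanity check of the converted VisDrone -> COCO annotation file.
# The path below is an assumption; point it at the JSON produced by
# UFPMP-Det's VisDrone2COCO.py (e.g. instances_UAVval_v1.json).
import json

ann_path = "data/visdrone/annotations/instances_UAVval_v1.json"

with open(ann_path) as f:
    coco = json.load(f)

print("images:     ", len(coco["images"]))
print("annotations:", len(coco["annotations"]))
print("categories: ", [(c["id"], c["name"]) for c in coco["categories"]])

# VisDrone-DET evaluates 10 object categories; a different count usually means
# the ignored-region / "others" classes leaked into the conversion.
assert len(coco["categories"]) == 10, "unexpected number of categories"
```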
By the way, it seems that all of your per-class APs are lower than mine (using my checkpoint):
class1: Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 25.02%.
class2: Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 15.54%.
class3: Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 11.45%.
class4: Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 59.02%.
class5: Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 37.53%.
class6: Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 28.76%.
class7: Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 19.67%.
class8: Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 11.65%.
class9: Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 45.94%.
class10: Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 25.61%.
Evaluation Completed. The performance of the detector is presented as follows.
Average Precision (AP) @[ IoU=0.50:0.95 | maxDets=500 ] = 28.37%.
Also, here is my test with the default COCO scripts. I think it is probably because of an incorrect dataset format.
I tested your CEASC model, which is fine.
I have fixed a bug in the GFL baseline, so you can test it again. By the way, it seems strange that the MATLAB test results did not change while the COCO results did; maybe you used the wrong pred_txt/ folder for the test? (A quick check is sketched below.)
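A minimal sketch of that check; both paths are placeholders for your local layout, and the script is not part of the repo.

```python
# Quick check that pred_txt/ actually matches the VisDrone val split and was
# freshly regenerated. Both paths are placeholders for your own layout.
import os
import time

pred_dir = "./pred_txt"                        # folder written by tools/json_to_txt.py
gt_dir = "./VisDrone2019-DET-val/annotations"  # official VisDrone val annotations

pred = {os.path.splitext(f)[0] for f in os.listdir(pred_dir) if f.endswith(".txt")}
gt = {os.path.splitext(f)[0] for f in os.listdir(gt_dir) if f.endswith(".txt")}

print("pred files:", len(pred), " gt files:", len(gt))
print("missing predictions:   ", sorted(gt - pred)[:5])
print("unexpected predictions:", sorted(pred - gt)[:5])

# If the MATLAB score never changes, a stale folder is a common cause:
newest = max(os.path.getmtime(os.path.join(pred_dir, f + ".txt")) for f in pred)
print("newest prediction written at:", time.ctime(newest))
```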
After using your fixed code and checkpoint, I can get your performance now.
Hello, I am also reproducing the code. Could you tell me how to test with ignored regions, please?
Use the official MATLAB code provided by the VisDrone team.
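The official toolkit handles this in MATLAB, but if it helps intuition, here is a rough Python sketch of the idea as I understand it: ground-truth rows with category 0 mark ignored regions, and detections whose centers fall inside them are dropped before matching. This is only an illustration, not the official evaluation code.

```python
# Rough illustration of ignored-region filtering, assuming VisDrone annotation rows
# of the form <x>,<y>,<w>,<h>,<score>,<category>,... with category 0 = ignored region.
# The official MATLAB evalDET from the VisDrone team is the authoritative version.

def load_ignored_regions(gt_txt_path):
    """Return the (x, y, w, h) boxes marked as ignored regions (category 0)."""
    regions = []
    with open(gt_txt_path) as f:
        for line in f:
            vals = line.strip().strip(",").split(",")
            x, y, w, h, _, cat = (int(float(v)) for v in vals[:6])
            if cat == 0:
                regions.append((x, y, w, h))
    return regions

def keep_detection(det_box, ignored):
    """Keep a detection (x, y, w, h) unless its center lies in an ignored region."""
    x, y, w, h = det_box
    cx, cy = x + w / 2.0, y + h / 2.0
    return not any(rx <= cx <= rx + rw and ry <= cy <= ry + rh
                   for rx, ry, rw, rh in ignored)
```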
Got it, thank you very much!
Here is my pipeline to prepare & test the VisDrone dataset:
- Download the VisDrone dataset from its official website.
- Convert the annotations to COCO format using the script from UFPMP-Det: `python UFPMP-Det-Tools/build_dataset/VisDrone2COCO.py xxx xxx xxx/instances_UAVval_v1.json`
- Test:
  `CUDA_VISIBLE_DEVICES=1 python ./tools/test.py ./configs/UAV/baseline_gfl_res18_visdrone.py ./epoch_15.pth --eval bbox --out ./result.pkl`
  `python tools/vis_pkl.py --pkl_pathname ./result.pkl --json_pathname ./result.json`
  `python tools/json_to_txt.py --json_pathname ./result.json`
  `matlab -r evalDET`

The default COCO results are as follows, but the MATLAB test results have not changed.

I encountered the same problem as you. Were you able to achieve results similar to the author's in your subsequent training? After processing the data as you mentioned above, the model I trained myself had an mAP of around 0.25 when validated with mmdet.
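For anyone rebuilding the json_to_txt step of the pipeline above, this is a rough sketch of what that stage has to produce (one VisDrone-format txt per val image); the file layout and the structure of result.json are assumptions, and the repo's own tools/json_to_txt.py should be preferred.

```python
# Rough sketch of turning COCO-style detection results (result.json) into per-image
# VisDrone txt files for the MATLAB evalDET. Not the repo's script; the detection-row
# format is assumed to be <x>,<y>,<w>,<h>,<score>,<category>,-1,-1.
import json
import os
from collections import defaultdict

ann_path = "instances_UAVval_v1.json"  # assumed: converted val annotations
det_path = "result.json"               # assumed: list of {image_id, bbox, score, category_id}
out_dir = "pred_txt"                   # folder the MATLAB evaluation reads
os.makedirs(out_dir, exist_ok=True)

with open(ann_path) as f:
    images = {im["id"]: im["file_name"] for im in json.load(f)["images"]}

dets = defaultdict(list)
with open(det_path) as f:
    for d in json.load(f):
        dets[d["image_id"]].append(d)

for img_id, file_name in images.items():
    stem = os.path.splitext(os.path.basename(file_name))[0]
    with open(os.path.join(out_dir, stem + ".txt"), "w") as f:
        for d in dets.get(img_id, []):
            x, y, w, h = d["bbox"]
            f.write(f"{x:.2f},{y:.2f},{w:.2f},{h:.2f},"
                    f"{d['score']:.4f},{d['category_id']},-1,-1\n")
```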
I used your config baseline_gfl_res18_visdrone.py without change. The result is below (tested with MATLAB 2021a), which is far below your paper's result of AP 28.4, AP50 50.0.
What is the possible reason? Could you give me your checkpoint or a possible explanation, please?