JosephKJ / OWOD

(CVPR 2021 Oral) Open World Object Detection
https://josephkj.in
Apache License 2.0
1.02k stars · 153 forks

Confusion in t2_final / t2_ft and WI #67

Open Hrren opened 2 years ago

Hrren commented 2 years ago

Thank you for your reply, @JosephKJ. There are two questions I am still stuck on: after I finish Task 2 training, does the model_final.pth in t2_final work better than the one in t2_ft? I tested the model_final.pth from both folders and the results are different — why does this happen? Also, does the WI in the paper refer to the "Wilderness Impact: {0.8: {50: xxxxxx}}" entry in the log, or to something else?

Originally posted by @Hrren in https://github.com/JosephKJ/OWOD/issues/60#issuecomment-927237881

luckychay commented 2 years ago

I ran into the same issue. I used the models in t1 and t1_final respectively and got different results on a single image; it seems that only the model in t1_final can detect unknowns, while in #8 the author said the model in t1 was used for evaluation. I am also confused.

[Image: output_000000044877_t1 — result on t1] [Image: output_000000044877_t1_final — result on t1_final]

Ailice0 commented 2 years ago

Hi @luckychay, could you share a visualization tutorial? I use the metrics.json file provided by the author, and the command is as follows:

python /home/jar/CL/OWOD-master/tools/visualize_json_results.py --input /home/jar/CL/OWOD-master/models_backup/t1_only_thresh/metrics.json --output /home/jar/CL/OWOD-master/display --dataset t1_voc_coco_2007_train

The following error is displayed: [error screenshot]. Is there an error somewhere? Thanks!

MartinaProgrammer commented 2 years ago

@JosephKJ @luckychay Hello, I would like to ask how the WI value in the results table is calculated. In log.txt I see:

d2.evaluation.pascal_voc_evaluation INFO: Wilderness Impact: {0.1: {50: 0.017806111233238074}, 0.2: {50: 0.028345724907063198}, 0.3: {50: 0.038800776824728926}, 0.4: {50: 0.0477657935285054}, 0.5: {50: 0.046036375796533344}, 0.6: {50: 0.03933635758017955}, 0.7: {50: 0.04479009962816838}, 0.8: {50: 0.04795684951624818}, 0.9: {50: 0.04991899036411699}}

Thanks!
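For context on the questions above, here is a minimal sketch of how the Wilderness Impact metric is defined in the OWOD paper: WI = P_K / P_(K∪U) − 1, where P_K is the detector's precision on known classes evaluated in the closed-set setting, and P_(K∪U) is its precision when unknown-class instances are mixed in. The nested-dict shape in the log is {recall_level: {iou_threshold: WI}}, and the paper's tables quote the value at recall 0.8 and IoU threshold 50. The function and variable names below are illustrative, not taken from the repository's code.

```python
def wilderness_impact(precision_closed: float, precision_open: float) -> float:
    """WI = (closed-set precision / open-set precision) - 1.

    A positive WI means precision degrades once unknown objects enter the
    evaluation set (the model confuses unknowns with known classes);
    WI = 0 means the wilderness has no impact.
    """
    return precision_closed / precision_open - 1.0

# Illustrative precision values (not from the repository's logs):
# the logged structure is {recall_level: {iou_threshold: WI}}.
wi_log = {0.8: {50: wilderness_impact(precision_closed=0.90,
                                      precision_open=0.86)}}
print(wi_log[0.8][50])
```

Under this reading, the "Wilderness Impact: {0.8: {50: ...}}" entry asked about above would be the WI value at recall level 0.8 and IoU 50.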