I'm trying to reproduce the object detection result of the pretrained X101-FPN model from the Model Zoo on the COCO 2017 validation dataset. Below is the code that I used:
# set up the detectron2 logger
from detectron2.utils.logger import setup_logger
setup_logger()

# import the detectron2 utilities used below
from detectron2.engine import DefaultPredictor
from detectron2.config import get_cfg
from detectron2.data import build_detection_test_loader
from detectron2.data.datasets import register_coco_instances
from detectron2.evaluation import COCOEvaluator, inference_on_dataset

# register the COCO 2017 validation split under a custom name
register_coco_instances("COCO2017_val", {}, "./datasets/COCO2017/annotations/instances_val2017.json", "./datasets/COCO2017/images/val2017")

# build the config from the Model Zoo yaml and point it at the pretrained weights
cfg = get_cfg()
cfg.merge_from_file("./detectron2_repo/configs/COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x.yaml")
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5  # set threshold for this model
cfg.MODEL.WEIGHTS = "detectron2://COCO-Detection/faster_rcnn_X_101_32x8d_FPN_3x/139173657/model_final_68b088.pkl"
cfg.DATASETS.TEST = ("COCO2017_val", )
predictor = DefaultPredictor(cfg)

# run COCO-style evaluation over the registered validation set
evaluator = COCOEvaluator("COCO2017_val", cfg, False, output_dir="./output/")
val_loader = build_detection_test_loader(cfg, "COCO2017_val")
result = inference_on_dataset(predictor.model, val_loader, evaluator)
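For what it's worth, inference_on_dataset returns the metrics as a nested dict keyed by task ("bbox" for box detection), so the headline number can also be read out programmatically. A minimal sketch of that structure, with the dict mocked up to match the 39.6 figure from this run rather than produced by the evaluator (the other entries are left as placeholders):

```python
# Mocked-up shape of the dict returned by inference_on_dataset;
# only the "AP" value here comes from the run described in this post.
result = {"bbox": {"AP": 39.6, "AP50": ..., "AP75": ...}}

box_ap = result["bbox"]["AP"]  # COCO mAP averaged over IoU 0.50:0.95
print("box AP:", box_ap)
```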
And here is the result that I got:
If I'm reading this result correctly, the model only got 39.6 box AP, rather than the 43.0 reported on the Model Zoo page. The 2017 validation dataset was downloaded from the COCO homepage. I couldn't find any documentation on the SCORE_THRESH_TEST config option, so I left it as default (0.5).
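One way to see why SCORE_THRESH_TEST could matter for this gap: COCO-style AP integrates precision over recall, so low-confidence true positives that get filtered out before evaluation can no longer contribute recall, which caps the achievable AP. Below is a toy sketch of that effect (not Detectron2 code, and a simplified un-interpolated AP rather than COCO's 101-point version), comparing AP with and without a 0.5 score cutoff:

```python
def average_precision(preds, num_gt):
    """Simplified AP: preds is a list of (score, is_true_positive) for one class."""
    preds = sorted(preds, key=lambda p: p[0], reverse=True)
    tp = fp = 0
    ap = 0.0
    prev_recall = 0.0
    for score, is_tp in preds:
        if is_tp:
            tp += 1
        else:
            fp += 1
        recall = tp / num_gt
        precision = tp / (tp + fp)
        ap += precision * (recall - prev_recall)  # area under the P-R curve
        prev_recall = recall
    return ap

# Three ground-truth boxes; one correct detection has a low score (0.3).
dets = [(0.9, True), (0.8, True), (0.3, True), (0.2, False)]
full = average_precision(dets, num_gt=3)                           # -> 1.0
cut = average_precision([d for d in dets if d[0] >= 0.5], num_gt=3)  # -> ~0.667
print(full, cut)
```

The low-confidence but correct detection at score 0.3 is discarded by the 0.5 cutoff, so the thresholded AP drops from 1.0 to about 0.667 even though the detector itself is unchanged.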
My environment setup:
Please tell me if I'm missing something :) Thank you.