Closed vineetsharma14 closed 1 year ago
Hi @vineetsharma14, thanks for your interest in our work.
You don't need to prepare a custom data_mapper for validation. You just need to define a val split while registering your custom dataset, and everything should work fine. You may take a look at the defining and registering of the coco_panoptic dataset for reference.
Thanks for your response.
I have checked the two links for defining and registering the coco_panoptic dataset, which you had shared for reference.
However, things are not very clear. Do we need to write a custom "register_coco_instance_annos_seg.py" similar to "register_coco_panoptic_annos_semseg.py"?
NOTE : I am using a dataset in COCO format only for instance segmentation.
As mentioned in #17, I have used register_coco_instances from detectron2.data.datasets.
Can you please elaborate on what one needs to do to add a validation dataset?
Thanks !
Do we need to write a custom "register_coco_instance_annos_seg.py" similar to "register_coco_panoptic_annos_semseg.py"?
Hi @vineetsharma14, sorry for not getting back to you earlier. Yes, you need to write a custom register_dataset file with your train and val splits when using a custom dataset. Since your dataset is already in the COCO format and you are training for instance segmentation, register_coco_panoptic2instance.py might be a better reference for you. You will need to replace the image and JSON paths with those of your custom dataset.
Something like:
import os

from detectron2.data.datasets import register_coco_instances
from detectron2.data.datasets.builtin_meta import _get_builtin_metadata

_PREDEFINED_SPLITS_CUSTOM_DATASET = {
    "<YOUR_DATASET>_val": (<path-to-images>, <path-to-JSON-file>),
}

def register_instances_custom_dataset(root):
    for key, (image_root, json_file) in _PREDEFINED_SPLITS_CUSTOM_DATASET.items():
        # Assume pre-defined datasets live in `./datasets`.
        register_coco_instances(
            key,
            _get_builtin_metadata(<dataset-name>),
            os.path.join(root, json_file) if "://" not in json_file else json_file,
            os.path.join(root, image_root),
        )
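The `"://" not in json_file` check in the loop above can be sketched standalone: local annotation paths get joined with the dataset root, while fully qualified URIs (e.g. a remote filesystem path) are passed through unchanged. A minimal illustration with hypothetical paths:

```python
import os

def resolve_json_path(root, json_file):
    # Local paths are resolved relative to the dataset root; paths that
    # already carry a URI scheme (e.g. "s3://...") are used as-is.
    return os.path.join(root, json_file) if "://" not in json_file else json_file

local = resolve_json_path("datasets", "custom/annotations/val.json")
remote = resolve_json_path("datasets", "s3://bucket/val.json")
```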
Thanks @praeclarumjj3 for the guidance. Really appreciate it !
Hello There,
Firstly, thanks for sharing this amazing work!
I am using OneFormer for an instance segmentation task on a custom dataset.
I read #17 and used the "InstanceCOCOCustomNewBaselineDatasetMapper" from _instance_coco_custom_datasetmapper.py, and I am able to train the model on my dataset.
I was trying to figure out if I can get inference results on the validation dataset periodically, say every 100 iterations.
I modified the cfg as below -
cfg.DATASETS.TEST = "CustomInstSegVAL"
cfg.TEST.EVAL_PERIOD = 100
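One detail worth flagging in the snippet above (based on detectron2's config defaults, where DATASETS.TEST is declared as a tuple of dataset names): assigning a bare string instead of a one-element tuple means code that iterates over the test datasets will iterate over the characters of the name. A minimal sketch of the difference:

```python
# detectron2 declares cfg.DATASETS.TEST as a tuple of dataset names.
# A bare string still iterates, but over characters, not names:
as_string = "CustomInstSegVAL"
as_tuple = ("CustomInstSegVAL",)

string_items = [d for d in as_string]  # characters, not dataset names
tuple_items = [d for d in as_tuple]    # exactly one dataset name
```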
I created an overridden version of the build_test_loader function using the InstanceCOCOCustomNewBaselineDatasetMapper as below -

def build_test_loader(cls, cfg, dataset_name):
    val_mapper = InstanceCOCOCustomNewBaselineDatasetMapper(cfg, is_train = True)
    return build_detection_test_loader(DatasetCatalog.get('CustomInstSegVal'), mapper=val_mapper)
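As an aside, the dataset name set in cfg.DATASETS.TEST above ("CustomInstSegVAL") and the one passed to DatasetCatalog.get ("CustomInstSegVal") differ in case. Catalog keys are plain case-sensitive strings, so these would be two different entries; a stand-in dict (not detectron2's actual catalog) shows the lookup behaviour:

```python
# Stand-in for DatasetCatalog's internal name -> loader mapping;
# keys are ordinary case-sensitive strings.
catalog = {"CustomInstSegVal": lambda: []}

registered = "CustomInstSegVal" in catalog   # the name used at registration
mismatch = "CustomInstSegVAL" in catalog     # case mismatch -> lookup would fail
```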
NOTE: If I set is_train = False in

val_mapper = InstanceCOCOCustomNewBaselineDatasetMapper(cfg, is_train = False)

then it throws an AssertionError from the build_transform_gen function of InstanceCOCOCustomNewBaselineDatasetMapper.

If I set is_train = True in

val_mapper = InstanceCOCOCustomNewBaselineDatasetMapper(cfg, is_train = True)

then it throws an AssertionError from pycocotools/coco.py stating "AssertionError: Results do not correspond to current coco set".

Can you please guide me as to how we can use the validation dataset to test the model's performance during training.
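For context on the pycocotools error: COCO.loadRes raises "Results do not correspond to current coco set" when the image_ids carried by the predictions are not all present in the annotation JSON being evaluated, which is what happens if the loader yields images from a different split than the one the evaluator was built on. A simplified sketch of that membership check (the image ids below are made up):

```python
def results_match_coco_set(result_img_ids, coco_img_ids):
    # Mirrors the assert in pycocotools/coco.py (COCO.loadRes): results may
    # only reference image ids present in the current coco set.
    return set(result_img_ids) == set(result_img_ids) & set(coco_img_ids)

ok = results_match_coco_set([1, 2], [1, 2, 3])        # all ids in the val JSON
bad = results_match_coco_set([1, 7], [1, 2, 3])       # id 7 missing -> AssertionError
```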
Thanks !