SHI-Labs / OneFormer

OneFormer: One Transformer to Rule Universal Image Segmentation, arxiv 2022 / CVPR 2023
https://praeclarumjj3.github.io/oneformer
MIT License

How can we add Validation Dataset for periodic evaluation during training ? #31

Closed vineetsharma14 closed 1 year ago

vineetsharma14 commented 1 year ago

Hello There,

Firstly, thanks for sharing this amazing work!

I am using OneFormer for the instance segmentation task on a custom dataset.

I read #17 and used the `InstanceCOCOCustomNewBaselineDatasetMapper` from `instance_coco_custom_dataset_mapper.py`, and I am able to train the model on my dataset.

I was trying to figure out if I can get the inference results on the validation dataset periodically, say, every 100 iterations.

I modified the cfg as below (note that `DATASETS.TEST` expects a tuple):

    cfg.DATASETS.TEST = ("CustomInstSegVal",)
    cfg.TEST.EVAL_PERIOD = 100

I created an overridden version of the `build_test_loader` function using the `InstanceCOCOCustomNewBaselineDatasetMapper`, as below:

    def build_test_loader(cls, cfg, dataset_name):
        val_mapper = InstanceCOCOCustomNewBaselineDatasetMapper(cfg, is_train=True)
        return build_detection_test_loader(DatasetCatalog.get('CustomInstSegVal'), mapper=val_mapper)

NOTE: if I set `is_train = False`, i.e. `val_mapper = InstanceCOCOCustomNewBaselineDatasetMapper(cfg, is_train=False)`, then it throws an AssertionError from the `build_transform_gen` function of `InstanceCOCOCustomNewBaselineDatasetMapper`.

If I set `is_train = True`, i.e. `val_mapper = InstanceCOCOCustomNewBaselineDatasetMapper(cfg, is_train=True)`, then it throws an AssertionError from `pycocotools/coco.py`, stating "AssertionError: Results do not correspond to current coco set".

Can you please guide me on how to use the validation dataset to test the model's performance during training?

Thanks !

praeclarumjj3 commented 1 year ago

Hi @vineetsharma14, thanks for your interest in our work.

You don't need to prepare a custom data mapper for validation. You just need to define a val split while registering your custom dataset, and everything should work fine. You may take a look at the definition and registration of the coco_panoptic dataset for reference.
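For a COCO-format dataset, the val split is registered the same way as the train split, just under a different name and with its own image and JSON paths. Below is a minimal sketch of the split bookkeeping only; the dataset names and paths are hypothetical, and in an actual setup detectron2's `register_coco_instances` would be called once per resolved entry:

```python
import os

# Hypothetical split names and paths -- adapt to your dataset layout.
_CUSTOM_SPLITS = {
    "CustomInstSegTrain": ("custom/train_images", "custom/annotations/train.json"),
    "CustomInstSegVal": ("custom/val_images", "custom/annotations/val.json"),
}

def resolve_split_paths(root):
    """Resolve (image_root, json_file) pairs relative to the dataset root,
    mirroring how detectron2's builtin registration joins paths."""
    resolved = {}
    for name, (image_root, json_file) in _CUSTOM_SPLITS.items():
        resolved[name] = (
            os.path.join(root, image_root),
            # Remote JSON paths (containing "://") are kept as-is.
            os.path.join(root, json_file) if "://" not in json_file else json_file,
        )
    return resolved

paths = resolve_split_paths("datasets")
print(paths["CustomInstSegVal"][0])  # datasets/custom/val_images
```

Registering both entries under distinct names is what later lets the config refer to the val split by name.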

vineetsharma14 commented 1 year ago

Thanks for your response.

I have checked the two links you shared for reference, covering the definition and registration of the coco_panoptic dataset.

However, things are still not very clear. Do we need to write a custom `register_coco_instance_annos_seg.py` similar to `register_coco_panoptic_annos_semseg.py`?

NOTE: I am using a dataset in COCO format, for instance segmentation only.

As mentioned in #17, I have used `register_coco_instances` from `detectron2.data.datasets`.

Can you please elaborate on what one needs to do to add a validation dataset?

Thanks !

praeclarumjj3 commented 1 year ago

Do we need to write a custom "register_coco_instance_annos_seg.py" similar to "register_coco_panoptic_annos_semseg.py"?

Hi @vineetsharma14, sorry for not getting back to you earlier. Yes, you need to write a custom register_dataset file with your train and val splits when using a custom dataset. Since your dataset is already in the COCO format and you are training for instance segmentation, register_coco_panoptic2instance.py might be a better reference for you. You will need to replace the image and JSON paths with those of your custom dataset.

https://github.com/SHI-Labs/OneFormer/blob/761189909f392a110a4ead574d85ed3a17fbc8a7/oneformer/data/datasets/register_coco_panoptic2instance.py#L27-L29

Something like:

_PREDEFINED_SPLITS_CUSTOM_DATASET = {
    "<YOUR_DATASET>_val": (<path-to-images>, <path-to-JSON-file>),
}

def register_instances_custom_dataset(root):
    for key, (image_root, json_file) in _PREDEFINED_SPLITS_CUSTOM_DATASET.items():
        # Assume pre-defined datasets live in `./datasets`.
        register_coco_instances(
            key,
            _get_builtin_metadata(<dataset-name>),
            os.path.join(root, json_file) if "://" not in json_file else json_file,
            os.path.join(root, image_root),
        )
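Once both splits are registered, pointing the training config at the val split enables the periodic evaluation asked about above. The split names below are placeholders for whatever names you register:

```python
# Hypothetical split names; adapt to your registered dataset names.
cfg.DATASETS.TRAIN = ("<YOUR_DATASET>_train",)
cfg.DATASETS.TEST = ("<YOUR_DATASET>_val",)   # must be a tuple, not a bare string
cfg.TEST.EVAL_PERIOD = 100                    # evaluate every 100 iterations
```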
vineetsharma14 commented 1 year ago

Thanks @praeclarumjj3 for the guidance. Really appreciate it !