IDEA-Research / MaskDINO

[CVPR 2023] Official implementation of the paper "Mask DINO: Towards A Unified Transformer-based Framework for Object Detection and Segmentation"
Apache License 2.0

I registered my own dataset in COCO format, but met the following problems. #64

Closed. likaiucas closed this issue 1 year ago.

likaiucas commented 1 year ago

[03/29 02:28:52 d2.engine.hooks]: Overall training speed: 7 iterations in 0:00:08 (1.1956 s / it)
[03/29 02:28:52 d2.engine.hooks]: Total training time: 0:00:08 (0:00:00 on hooks)
[03/29 02:28:52 d2.utils.events]: eta: 5 days, 5:31:17 iter: 9 total_loss: 2190 loss_ce: 175.5 loss_mask: 2.751 loss_dice: 4.977 loss_bbox: 0.9077 loss_giou: 2.279 loss_ce_dn: 9.75 loss_mask_dn: 2.915 loss_dice_dn: 4.972 loss_bbox_dn: 0.2105 loss_giou_dn: 0.8641 loss_ce_0: 179.3 loss_mask_0: 2.256 loss_dice_0: 4.947 loss_bbox_0: 0.8538 loss_giou_0: 2.25 loss_ce_dn_0: 7.359 loss_mask_dn_0: 3.191 loss_dice_dn_0: 4.973 loss_bbox_dn_0: 0.2105 loss_giou_dn_0: 0.8641 loss_ce_1: 146.7 loss_mask_1: 2.531 loss_dice_1: 4.953 loss_bbox_1: 0.9208 loss_giou_1: 2.264 loss_ce_dn_1: 7.31 loss_mask_dn_1: 2.229 loss_dice_dn_1: 4.966 loss_bbox_dn_1: 0.2105 loss_giou_dn_1: 0.8641 loss_ce_2: 129.2 loss_mask_2: 3.127 loss_dice_2: 4.966 loss_bbox_2: 0.9533 loss_giou_2: 2.354 loss_ce_dn_2: 6.061 loss_mask_dn_2: 3.791 loss_dice_dn_2: 4.962 loss_bbox_dn_2: 0.2105 loss_giou_dn_2: 0.8641 loss_ce_3: 145.2 loss_mask_3: 3.077 loss_dice_3: 4.954 loss_bbox_3: 0.9007 loss_giou_3: 2.346 loss_ce_dn_3: 8.014 loss_mask_dn_3: 3.515 loss_dice_dn_3: 4.957 loss_bbox_dn_3: 0.2105 loss_giou_dn_3: 0.8641 loss_ce_4: 153.6 loss_mask_4: 3.383 loss_dice_4: 4.967 loss_bbox_4: 0.8383 loss_giou_4: 2.387 loss_ce_dn_4: 8.354 loss_mask_dn_4: 3.759 loss_dice_dn_4: 4.962 loss_bbox_dn_4: 0.2105 loss_giou_dn_4: 0.8641 loss_ce_5: 137.3 loss_mask_5: 3.356 loss_dice_5: 4.969 loss_bbox_5: 0.9351 loss_giou_5: 2.291 loss_ce_dn_5: 8.322 loss_mask_dn_5: 3.792 loss_dice_dn_5: 4.961 loss_bbox_dn_5: 0.2105 loss_giou_dn_5: 0.8641 loss_ce_6: 200.1 loss_mask_6: 3.41 loss_dice_6: 4.969 loss_bbox_6: 0.9624 loss_giou_6: 2.337 loss_ce_dn_6: 11.08 loss_mask_dn_6: 3.544 loss_dice_dn_6: 4.971 loss_bbox_dn_6: 0.2105 loss_giou_dn_6: 0.8641 loss_ce_7: 218.1 loss_mask_7: 3.035 loss_dice_7: 4.953 loss_bbox_7: 0.882 loss_giou_7: 2.24 loss_ce_dn_7: 12.13 loss_mask_dn_7: 3.038 loss_dice_dn_7: 4.948 loss_bbox_dn_7: 0.2105 loss_giou_dn_7: 0.8641 loss_ce_8: 215.2 loss_mask_8: 2.38 loss_dice_8: 4.943 loss_bbox_8: 0.8694 loss_giou_8: 2.286 loss_ce_dn_8: 11.06 loss_mask_dn_8: 2.694 loss_dice_dn_8: 4.946 loss_bbox_dn_8: 0.2105 loss_giou_dn_8: 0.8641 loss_ce_interm: 179.3 loss_mask_interm: 2.256 loss_dice_interm: 4.946 loss_bbox_interm: 0.8538 loss_giou_interm: 2.25 time: 1.1954 data_time: 0.0533 lr: 0.0001 max_mem: 6515M

Traceback (most recent call last):
  File "/config_data/code/MaskDINO/train.py", line 441, in <module>
    args=(args,),
  File "/opt/conda/lib/python3.7/site-packages/detectron2/engine/launch.py", line 82, in launch
    main_func(*args)
  File "/config_data/code/MaskDINO/train.py", line 422, in main
    return trainer.train()
  File "/opt/conda/lib/python3.7/site-packages/detectron2/engine/defaults.py", line 484, in train
    super().train(self.start_iter, self.max_iter)
  File "/opt/conda/lib/python3.7/site-packages/detectron2/engine/train_loop.py", line 149, in train
    self.run_step()
  File "/opt/conda/lib/python3.7/site-packages/detectron2/engine/defaults.py", line 494, in run_step
    self._trainer.run_step()
  File "/opt/conda/lib/python3.7/site-packages/detectron2/engine/train_loop.py", line 391, in run_step
    data = next(self._data_loader_iter)
  File "/opt/conda/lib/python3.7/site-packages/detectron2/data/common.py", line 234, in __iter__
    for d in self.dataset:
  File "/opt/conda/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 521, in __next__
    data = self._next_data()
  File "/opt/conda/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 1203, in _next_data
    return self._process_data(data)
  File "/opt/conda/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 1229, in _process_data
    data.reraise()
  File "/opt/conda/lib/python3.7/site-packages/torch/_utils.py", line 425, in reraise
    raise self.exc_type(msg)
ValueError: Caught ValueError in DataLoader worker process 1.
Original Traceback (most recent call last):
  File "/opt/conda/lib/python3.7/site-packages/detectron2/data/detection_utils.py", line 404, in annotations_to_instances
    masks = PolygonMasks(segms)
  File "/opt/conda/lib/python3.7/site-packages/detectron2/structures/masks.py", line 308, in __init__
    process_polygons(polygons_per_instance) for polygons_per_instance in polygons
  File "/opt/conda/lib/python3.7/site-packages/detectron2/structures/masks.py", line 308, in <listcomp>
    process_polygons(polygons_per_instance) for polygons_per_instance in polygons
  File "/opt/conda/lib/python3.7/site-packages/detectron2/structures/masks.py", line 298, in process_polygons
    "Got '{}' instead.".format(type(polygons_per_instance))
ValueError: Cannot create polygons: Expect a list of polygons per instance. Got '<class 'numpy.ndarray'>' instead.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.7/site-packages/torch/utils/data/_utils/worker.py", line 287, in _worker_loop
    data = fetcher.fetch(index)
  File "/opt/conda/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 28, in fetch
    data.append(next(self.dataset_iter))
  File "/opt/conda/lib/python3.7/site-packages/detectron2/data/common.py", line 201, in __iter__
    yield self.dataset[idx]
  File "/opt/conda/lib/python3.7/site-packages/detectron2/data/common.py", line 90, in __getitem__
    data = self._map_func(self._dataset[cur_idx])
  File "/opt/conda/lib/python3.7/site-packages/detectron2/utils/serialize.py", line 26, in __call__
    return self._obj(*args, **kwargs)
  File "/config_data/code/MaskDINO/maskdino/data/dataset_mappers/coco_instance_new_baseline_dataset_mapper.py", line 171, in __call__
    instances = utils.annotations_to_instances(annos, image_shape)
  File "/opt/conda/lib/python3.7/site-packages/detectron2/data/detection_utils.py", line 408, in annotations_to_instances
    ) from e
ValueError: Failed to use mask_format=='polygon' from the given annotations!

About my dataset

One sample of my annotations: {'segmentation': [[...]], 'iscrowd': 0, 'image_id': 0, 'category_id': 1, 'bbox': [19, 496, 22, 11], 'area': 232.0, 'id': 0}
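For reference, in COCO polygon format each 'segmentation' value is a list of polygons, where every polygon is a flat Python list of at least six floats (x1, y1, x2, y2, ...). A minimal sketch of a consistency check over the annotation file (the path below is a placeholder, not part of MaskDINO):

```python
import json

# Placeholder path; point this at your own instances JSON.
ann_file = "datasets/my_dataset/annotations/instances_train.json"

with open(ann_file) as f:
    coco = json.load(f)

bad = []
for ann in coco["annotations"]:
    seg = ann.get("segmentation")
    # Polygon format: a non-empty list of polygons, each a flat list of >= 6 numbers.
    ok = (
        isinstance(seg, list)
        and len(seg) > 0
        and all(isinstance(p, list) and len(p) >= 6 for p in seg)
    )
    if not ok:
        bad.append(ann["id"])

print(f"{len(bad)} annotations are not plain polygon lists, e.g. ids {bad[:10]}")
```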

FengLi-ust commented 1 year ago

It seems that your mask annotation is not in polygon format but is a list of numpy arrays. You can check your dataset format to make sure it aligns with COCO.
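If the segmentations do turn out to be numpy arrays (for example produced by a conversion script), a minimal sketch of coercing them back into COCO's list-of-lists polygon form before registering the JSON could look like this (the helper name is mine, not part of MaskDINO or detectron2):

```python
import numpy as np

def to_polygon_lists(segmentation):
    """Coerce a per-instance segmentation into COCO's list-of-lists polygon form."""
    # A bare ndarray of coordinates -> wrap it as a single polygon.
    if isinstance(segmentation, np.ndarray):
        return [segmentation.reshape(-1).tolist()]
    # A list that may mix ndarrays and plain lists -> flatten each polygon to floats.
    return [
        p.reshape(-1).tolist() if isinstance(p, np.ndarray) else list(map(float, p))
        for p in segmentation
    ]
```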

likaiucas commented 1 year ago

OK, thanks, I will check it. I debugged the code; the segmentation seems to be converted in line 164 of maskdino/data/dataset_mappers/coco_instance_new_baseline_dataset_mapper.py.

likaiucas commented 1 year ago

I checked COCO, and my annotations are in polygon format. The problem is in line 164 of maskdino/data/dataset_mappers/coco_instance_new_baseline_dataset_mapper.py, in the call to utils.transform_instance_annotations.
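I have not verified this on MaskDINO itself, but as a sketch of a workaround one could coerce whatever comes out of utils.transform_instance_annotations back into the form PolygonMasks expects, right before the annotations_to_instances call in the mapper (the helper below is illustrative, not the actual mapper code):

```python
import numpy as np

def coerce_polygon_annos(annos):
    """Hypothetical helper: make each 'segmentation' a list of flat float lists,
    which is what annotations_to_instances expects for mask_format='polygon'."""
    for anno in annos:
        seg = anno.get("segmentation")
        if isinstance(seg, np.ndarray):
            # A bare ndarray per instance is exactly what triggers the ValueError above.
            anno["segmentation"] = [seg.reshape(-1).tolist()]
        elif isinstance(seg, (list, tuple)):
            anno["segmentation"] = [
                p.reshape(-1).tolist() if isinstance(p, np.ndarray) else p
                for p in seg
            ]
    return annos

# e.g. in the mapper, just before utils.annotations_to_instances(annos, image_shape):
# annos = coerce_polygon_annos(annos)
```

Another thing worth checking is detectron2's INPUT.MASK_FORMAT setting ('polygon' vs 'bitmask'), though I have not confirmed how this particular mapper consumes it.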

likaiucas commented 1 year ago

Load it from detectron2.


> @likaiucas Hello, I want to follow your steps to train my own dataset. Can you provide the function lambda: load_coco_json(json_file, image_root, name)? Thank you!

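For the question quoted above, a minimal sketch of registering a COCO-format dataset through detectron2's load_coco_json (the dataset name and paths are placeholders):

```python
from detectron2.data import DatasetCatalog, MetadataCatalog
from detectron2.data.datasets import load_coco_json

# Placeholder paths and dataset name; point these at your own data.
json_file = "datasets/my_dataset/annotations/instances_train.json"
image_root = "datasets/my_dataset/train"
name = "my_dataset_train"

# Register a loader that returns detectron2-style dataset dicts from the COCO JSON.
DatasetCatalog.register(name, lambda: load_coco_json(json_file, image_root, name))
MetadataCatalog.get(name).set(
    json_file=json_file,
    image_root=image_root,
    evaluator_type="coco",
)
```

detectron2's register_coco_instances(name, {}, json_file, image_root) wraps this same pattern, so either route should behave the same.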