Yes, Panoptic-DeepLab uses register_coco_panoptic on the COCO dataset.
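For reference, a minimal sketch of what registering a custom dataset in the standard COCO panoptic format could look like (the dataset name, all paths, and the metadata dict below are placeholders, not anything from this repo):

```python
from detectron2.data.datasets import register_coco_panoptic

# Placeholder metadata: in practice this dict holds thing_classes, stuff_classes,
# thing_dataset_id_to_contiguous_id and stuff_dataset_id_to_contiguous_id.
my_metadata = {}

register_coco_panoptic(
    name="my_dataset_train_panoptic",                    # hypothetical dataset name
    metadata=my_metadata,
    image_root="datasets/my_dataset/train",              # directory with the RGB images
    panoptic_root="datasets/my_dataset/panoptic_train",  # directory with the panoptic PNGs
    panoptic_json="datasets/my_dataset/annotations/panoptic_train.json",
    instances_json="datasets/my_dataset/annotations/instances_train.json",
)
```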
Hi Bowen, we have tried register_coco_panoptic but still run into a problem; the error message is about 'stuff_dataset_id_to_contiguous_id' in load_coco_panoptic_json.
In the documentation of register_coco_panoptic, the description of the argument sem_seg_root ("none: not used, to be consistent with register_coco_panoptic_separated") confused us. Our question is whether 'stuff_dataset_id_to_contiguous_id' follows the convention of register_coco_panoptic_separated, which assigns "things" a semantic id of 0 and gives all stuff categories contiguous ids in the range [1, #stuff_categories]. So far we know that load_coco_json adds "thing_dataset_id_to_contiguous_id" to the metadata, but we have no idea how the stuff ids are transformed by register_coco_panoptic.
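For reference, one way to inspect what these mappings end up as, assuming the builtin COCO splits are registered in your environment (dataset name as in stock detectron2; adjust if yours differs):

```python
from detectron2.data import MetadataCatalog

# "coco_2017_train_panoptic" is the builtin standard-format split used by Panoptic-DeepLab.
meta = MetadataCatalog.get("coco_2017_train_panoptic")
print(meta.thing_dataset_id_to_contiguous_id)  # original COCO category id -> contiguous train id (things)
print(meta.stuff_dataset_id_to_contiguous_id)  # original COCO category id -> contiguous train id (stuff)
```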
In the meantime, we checked our custom dataset's .json and its annotation format is exactly the same as panoptic_train2017.json. Hope you can help us figure out the dataset registration problem. Thanks in advance.
Forgot to mention that our custom dataset does have new categories compared to the standard COCO dataset; we found a similar complaint about dataset registration here.
No, stuff_dataset_id_to_contiguous_id does not follow register_coco_panoptic_separated. You can think of it as a mapping from evaluation_id to train_id, as used in the Cityscapes dataset.
For example, the category ids you annotated could be "3, 8, 9, 20", but you will need to map them to "0, 1, 2, 3" for training. If "3" and "20" are stuff, stuff_dataset_id_to_contiguous_id will be a mapping from 3 to 0 and from 20 to 3. It is used to identify which classes are thing and which are stuff.
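A small sketch of that example, with a made-up category list just to show where each dict entry comes from (the names and isthing flags are hypothetical):

```python
# Hypothetical categories annotated with non-contiguous ids 3, 8, 9, 20.
MY_CATEGORIES = [
    {"id": 3,  "name": "sky",    "isthing": 0},
    {"id": 8,  "name": "person", "isthing": 1},
    {"id": 9,  "name": "car",    "isthing": 1},
    {"id": 20, "name": "grass",  "isthing": 0},
]

thing_dataset_id_to_contiguous_id = {}
stuff_dataset_id_to_contiguous_id = {}
for train_id, cat in enumerate(MY_CATEGORIES):
    # train_id is the contiguous id (0..3); cat["id"] is the id used in the annotations.
    if cat["isthing"]:
        thing_dataset_id_to_contiguous_id[cat["id"]] = train_id
    else:
        stuff_dataset_id_to_contiguous_id[cat["id"]] = train_id

print(thing_dataset_id_to_contiguous_id)  # {8: 1, 9: 2}
print(stuff_dataset_id_to_contiguous_id)  # {3: 0, 20: 3}
```

These two dicts (plus thing_classes/stuff_classes) are what would go into the metadata passed to register_coco_panoptic, if I read the convention correctly.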
Thanks for the explanation; I think we did those in the correct way. I guess the problem might be that the custom dataset we plan to train on has some categories that are not in the COCO dataset. May I ask if it is still possible to train Panoptic-DeepLab on this kind of "custom dataset"?
Yes, you can train Panoptic-DeepLab with any custom dataset.
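In case it is useful, after registering the custom splits the config changes would look roughly like this (the dataset names, config path, and class count are placeholders; the keys are the usual detectron2/Panoptic-DeepLab ones):

```python
from detectron2.config import get_cfg
from detectron2.projects.panoptic_deeplab import add_panoptic_deeplab_config

cfg = get_cfg()
add_panoptic_deeplab_config(cfg)
cfg.merge_from_file("path/to/your/panoptic_deeplab_config.yaml")  # placeholder config path

# Hypothetical splits registered earlier with register_coco_panoptic.
cfg.DATASETS.TRAIN = ("my_dataset_train_panoptic",)
cfg.DATASETS.TEST = ("my_dataset_val_panoptic",)
# Total number of categories (things + stuff) in the custom dataset.
cfg.MODEL.SEM_SEG_HEAD.NUM_CLASSES = 4
```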
@guoxuerui Did you succeed in training Panoptic-DeepLab with a custom dataset? Did you use register_coco_panoptic_separated or register_coco_panoptic?
Correct me if I am wrong:
Hi, may I ask whether register_coco_panoptic works for Panoptic-DeepLab? We now have a custom dataset in the COCO panoptic annotation format and want to train Panoptic-DeepLab on it.