I tried the command

```bash
python train_net.py --resume --num-gpus $n --config-file configs/semantic_sam_reproduce_sam_swinL.yaml COCO.TEST.BATCH_SIZE_TOTAL=$n SAM.TEST.BATCH_SIZE_TOTAL=$n SAM.TRAIN.BATCH_SIZE_TOTAL=$n
```

but found that some training configs are missing, such as `MIN_SIZE_TRAIN` and `CROP.TYPE`.
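The keys I added follow the usual detectron2 `INPUT` conventions; the exact values below are placeholders I picked myself, not values from the repo:

```yaml
INPUT:
  MIN_SIZE_TRAIN: [1024]   # placeholder value
  CROP:
    ENABLED: True
    TYPE: "absolute"       # the missing CROP.TYPE key
    SIZE: [1024, 1024]     # placeholder value
```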
File "/home/ma-user/work/detectron2-xyz-main/detectron2/data/dataset_mapper.py", line 154, in __call__
image = utils.read_image(dataset_dict["file_name"], format=self.image_format)
KeyError: 'file_name'
when I use the sam json mapper.
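The traceback suggests that the default detectron2 `DatasetMapper` is being applied to dataset dicts that don't carry a `file_name` key. For reference, a quick check of which keys the registered dataset dicts actually contain looks roughly like this (the dataset name is a placeholder, not necessarily the name used by this config):

```python
from detectron2.data import DatasetCatalog

# Inspect the first registered dataset dict; the default DatasetMapper
# expects a "file_name" key pointing to the image on disk.
dicts = DatasetCatalog.get("sam_train")  # placeholder dataset name
print(sorted(dicts[0].keys()))
```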
Could you please give a detailed guide on how to run this training experiment?