jbwang1997 / OBBDetection

OBBDetection is an oriented object detection library based on MMDetection, released under the Apache License 2.0.

Training on a custom dataset: [Errno 2: No such file or directory] when loading the train_annotation.pkl file #157

Open · kumarv987 opened this issue 2 years ago

kumarv987 commented 2 years ago

I'm training on a custom dataset. I followed the steps for custom datasets and created a train_annotation.pkl file (see the sketch after the log below for what this file is typically expected to contain). I saved the images and the train_annotation.pkl file in the data folder at the root of the repository (OBBDetection/data), which looks like this:

    OBBDetection/data/
        images/                 # location of the PNG files
        train_annotation.pkl

When I run

    !python tools/train.py configs/obb/faster_rcnn_obb/faster_rcnn_obb_r50_fpn_3x_custom.py

I get the following output, which ends in an error:

-------------------------------------------------- SCRIPT START --------------------------------------------------

2022-05-21 04:49:33,745 - mmdet - INFO - Environment info:

sys.platform: linux
Python: 3.7.13 (default, Apr 24 2022, 01:04:09) [GCC 7.5.0]
CUDA available: True
CUDA_HOME: /usr/local/cuda
NVCC: Build cuda_11.1.TC455_06.29190527_0
GPU 0: Tesla T4
GCC: gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
PyTorch: 1.10.0+cu111
PyTorch compiling details: PyTorch built with:

TorchVision: 0.11.0+cu111
OpenCV: 4.1.2
MMCV: 1.5.1
MMDetection: 2.2.0+unknown
MMDetection Compiler: GCC 7.5
MMDetection CUDA Compiler: 11.1

2022-05-21 04:49:33,746 - mmdet - INFO - Distributed training: False
2022-05-21 04:49:34,029 - mmdet - INFO - Config:
dataset_type = 'CustomDataset'
data_root = '/content/drive/MyDrive/Capstone_project_datasets/OBBDetection-master/data/'
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadOBBAnnotations', with_bbox=True, with_label=True, obb_as_mask=True),
    dict(type='OBBRandomFlip', h_flip_ratio=0.5, v_flip_ratio=0.5),
    dict(type='Normalize', mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True),
    dict(type='RandomOBBRotate', rotate_after_flip=True, angles=(0, 0), vert_rate=0.5),
    dict(type='Pad', size_divisor=32),
    dict(type='Mask2OBB', obb_type='obb'),
    dict(type='OBBDefaultFormatBundle'),
    dict(type='OBBCollect', keys=['img', 'gt_bboxes', 'gt_obboxes', 'gt_labels'])
]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=4,
    train=dict(
        type='CustomDataset',
        ann_file='/content/drive/MyDrive/Capstone_project_datasets/OBBDetection-master/data/train_annotation.pkl',
        img_prefix='/content/drive/MyDrive/Capstone_project_datasets/OBBDetection-master/data/images/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadOBBAnnotations', with_bbox=True, with_label=True, obb_as_mask=True),
            dict(type='OBBRandomFlip', h_flip_ratio=0.5, v_flip_ratio=0.5),
            dict(type='Normalize', mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True),
            dict(type='RandomOBBRotate', rotate_after_flip=True, angles=(0, 0), vert_rate=0.5),
            dict(type='Pad', size_divisor=32),
            dict(type='Mask2OBB', obb_type='obb'),
            dict(type='OBBDefaultFormatBundle'),
            dict(type='OBBCollect', keys=['img', 'gt_bboxes', 'gt_obboxes', 'gt_labels'])
        ]))
evaluation = None
optimizer = dict(type='SGD', lr=0.005, momentum=0.9, weight_decay=0.0001)
optimizer_config = dict(grad_clip=dict(max_norm=35, norm_type=2))
lr_config = dict(
    policy='step', warmup='linear', warmup_iters=500, warmup_ratio=0.001, step=[24, 33])
total_epochs = 36
checkpoint_config = dict(interval=3)
log_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
model = dict(
    type='FasterRCNNOBB',
    pretrained='torchvision://resnet50',
    backbone=dict(
        type='ResNet', depth=50, num_stages=4, out_indices=(0, 1, 2, 3), frozen_stages=1,
        norm_cfg=dict(type='BN', requires_grad=True), norm_eval=True, style='pytorch'),
    neck=dict(type='FPN', in_channels=[256, 512, 1024, 2048], out_channels=256, num_outs=5),
    rpn_head=dict(
        type='RPNHead',
        in_channels=256,
        feat_channels=256,
        anchor_generator=dict(
            type='AnchorGenerator', scales=[8], ratios=[0.5, 1.0, 2.0], strides=[4, 8, 16, 32, 64]),
        bbox_coder=dict(
            type='DeltaXYWHBBoxCoder', target_means=[0.0, 0.0, 0.0, 0.0], target_stds=[1.0, 1.0, 1.0, 1.0]),
        loss_cls=dict(type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),
        loss_bbox=dict(type='SmoothL1Loss', beta=0.1111111111111111, loss_weight=1.0)),
    roi_head=dict(
        type='OBBStandardRoIHead',
        bbox_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', out_size=7, sample_num=2),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        bbox_head=dict(
            type='OBBShared2FCBBoxHead',
            start_bbox_type='hbb',
            end_bbox_type='obb',
            in_channels=256,
            fc_out_channels=1024,
            roi_feat_size=7,
            num_classes=1,
            bbox_coder=dict(
                type='HBB2OBBDeltaXYWHTCoder',
                target_means=[0.0, 0.0, 0.0, 0.0, 0.0],
                target_stds=[0.1, 0.1, 0.2, 0.2, 0.1]),
            reg_class_agnostic=True,
            loss_cls=dict(type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
            loss_bbox=dict(type='SmoothL1Loss', beta=1.0, loss_weight=1.0))))
train_cfg = dict(
    rpn=dict(
        assigner=dict(
            type='MaxIoUAssigner', pos_iou_thr=0.7, neg_iou_thr=0.3, min_pos_iou=0.3,
            match_low_quality=True, gpu_assign_thr=200, ignore_iof_thr=-1),
        sampler=dict(
            type='RandomSampler', num=256, pos_fraction=0.5, neg_pos_ub=-1, add_gt_as_proposals=False),
        allowed_border=0,
        pos_weight=-1,
        debug=False),
    rpn_proposal=dict(
        nms_across_levels=False, nms_pre=2000, nms_post=2000, max_num=2000, nms_thr=0.7, min_bbox_size=0),
    rcnn=dict(
        assigner=dict(
            type='MaxIoUAssigner', pos_iou_thr=0.5, neg_iou_thr=0.5, min_pos_iou=0.5,
            match_low_quality=False, ignore_iof_thr=-1, iou_calculator=dict(type='BboxOverlaps2D')),
        sampler=dict(
            type='RandomSampler', num=512, pos_fraction=0.25, neg_pos_ub=-1, add_gt_as_proposals=True),
        pos_weight=-1,
        debug=False))
test_cfg = dict(
    rpn=dict(
        nms_across_levels=False, nms_pre=2000, nms_post=2000, max_num=2000, nms_thr=0.7, min_bbox_size=0),
    rcnn=dict(score_thr=0.05, nms=dict(type='obb_nms', iou_thr=0.1), max_per_img=2000))
work_dir = './work_dirs/faster_rcnn_obb_r50_fpn_3x_custom'
gpu_ids = range(0, 1)

2022-05-21 04:49:34,389 - mmdet - INFO - load model from: torchvision://resnet50
2022-05-21 04:49:34,389 - mmdet - INFO - load checkpoint from torchvision path: torchvision://resnet50
2022-05-21 04:49:34,497 - mmdet - WARNING - The model and loaded state dict do not match exactly

unexpected key in source state_dict: fc.weight, fc.bias

ANN_FILE: /content/drive/MyDrive/Capstone_project_datasets/OBBDetection-master/data/train_annotation.pkl
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/mmcv/utils/registry.py", line 66, in build_from_cfg
    return obj_cls(**args)
  File "/content/drive/MyDrive/Capstone_project_datasets/OBBDetection-master/mmdet/datasets/custom.py", line 83, in __init__
    self.data_infos = self.load_annotations(self.ann_file)
  File "/content/drive/MyDrive/Capstone_project_datasets/OBBDetection-master/mmdet/datasets/custom.py", line 110, in load_annotations
    return mmcv.load(ann_file)
  File "/usr/local/lib/python3.7/dist-packages/mmcv/fileio/io.py", line 60, in load
    with BytesIO(file_client.get(file)) as f:
  File "/usr/local/lib/python3.7/dist-packages/mmcv/fileio/file_client.py", line 1015, in get
    return self.client.get(filepath)
  File "/usr/local/lib/python3.7/dist-packages/mmcv/fileio/file_client.py", line 535, in get
    with open(filepath, 'rb') as f:
FileNotFoundError: [Errno 2] No such file or directory: '/content/drive/MyDrive/Capstone_project_datasets/OBBDetection-master/data/train_annotation.pkl'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "tools/train.py", line 153, in <module>
    main()
  File "tools/train.py", line 128, in main
    datasets = [build_dataset(cfg.data.train)]
  File "/content/drive/MyDrive/Capstone_project_datasets/OBBDetection-master/mmdet/datasets/builder.py", line 63, in build_dataset
    dataset = build_from_cfg(cfg, DATASETS, default_args)
  File "/usr/local/lib/python3.7/dist-packages/mmcv/utils/registry.py", line 69, in build_from_cfg
    raise type(e)(f'{obj_cls.__name__}: {e}')
FileNotFoundError: CustomDataset: [Errno 2] No such file or directory: '/content/drive/MyDrive/Capstone_project_datasets/OBBDetection-master/data/train_annotation.pkl'
------------------------------------------------------ END SCRIPT ------------------------------------------------------
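For reference, the traceback shows CustomDataset.load_annotations reading the annotation file with mmcv.load, so train_annotation.pkl is expected to be a pickled Python object in MMDetection's "middle format" (a list of per-image dicts). Below is a minimal, hedged sketch of how such a file might be written; only the standard bboxes/labels fields are shown, and any extra keys needed for oriented boxes (e.g. the masks/polygons consumed by LoadOBBAnnotations with obb_as_mask=True) are assumptions to be checked against OBBDetection's custom-dataset docs.

    # Minimal sketch (not the verified OBBDetection schema): build a
    # CustomDataset annotation list in MMDetection's "middle format"
    # and save it with mmcv.dump, which pickles objects for a .pkl suffix.
    import numpy as np
    import mmcv

    data_infos = [
        dict(
            filename='0001.png',   # image name, relative to img_prefix
            width=1024,
            height=1024,
            ann=dict(
                # horizontal boxes, shape (n, 4), float32, [x1, y1, x2, y2]
                bboxes=np.array([[10., 20., 110., 220.]], dtype=np.float32),
                # class indices, shape (n,), int64
                labels=np.array([0], dtype=np.int64),
            ),
        ),
    ]

    mmcv.dump(data_infos, 'data/train_annotation.pkl')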

What could be causing this error?
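One quick way to narrow this down is to load the same config and check whether the paths it resolves to actually exist inside the runtime (for example, whether the Drive mount is present and the file name matches exactly). A minimal sketch, assuming it is run from the OBBDetection root:

    # Minimal sketch: print the dataset paths the config resolves to and
    # verify they exist on disk before launching tools/train.py.
    import os.path as osp
    import mmcv

    cfg = mmcv.Config.fromfile(
        'configs/obb/faster_rcnn_obb/faster_rcnn_obb_r50_fpn_3x_custom.py')

    ann_file = cfg.data.train.ann_file
    img_prefix = cfg.data.train.img_prefix

    print('ann_file:  ', ann_file, '->', 'found' if osp.isfile(ann_file) else 'MISSING')
    print('img_prefix:', img_prefix, '->', 'found' if osp.isdir(img_prefix) else 'MISSING')

If either line prints MISSING, the usual culprits are a file name or extension that differs from what the config expects, or an ann_file/data_root that points somewhere other than where the data folder actually lives.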

kumarv987 commented 2 years ago

All good, I figured it out.