drprojects / DeepViewAgg

[CVPR'22 Best Paper Finalist] Official PyTorch implementation of the method presented in "Learning Multi-View Aggregation In the Wild for Large-Scale 3D Semantic Segmentation"

A problem about IndexError: index 8 is out of bounds for dimension 0 with size 5 #27

Closed · Tommydied closed this issue 1 year ago

Tommydied commented 1 year ago

Before this training session, I trained a 3D monocular model on the KITTI-360 dataset. After that training completed successfully, I switched the training data to the point cloud data I actually want to train on, but after changing the data I ran into the following error:

Error executing job with overrides: ['data=segmentation/kitti360-sparse', 'models=segmentation/sparseconv3d', 'model_name=Res16UNet34', 'task=segmentation', 'training=kitti360_benchmark/sparseconv3d', 'lr_scheduler=multi_step_kitti360', 'eval_frequency=5', 'data.sample_per_epoch=12000', 'data.dataroot=./directory', 'data.train_is_trainval=False', 'data.mini=False', 'training.cuda=0', 'training.batch_size=8', 'training.epochs=60', 'training.num_workers=4', 'training.optim.base_lr=0.1', 'training.wandb.log=True', 'training.wandb.name=My_awesome_KITTI-360_experiment', 'tracker_options.make_submission=False', 'training.checkpoint_dir=']
Traceback (most recent call last):
  File "train.py", line 13, in main
    trainer = Trainer(cfg)
  File "/home/DeepViewAgg-release/torch_points3d/trainer.py", line 46, in __init__
    self._initialize_trainer()
  File "/home/DeepViewAgg-release/torch_points3d/trainer.py", line 92, in _initialize_trainer
    self._dataset: BaseDataset = instantiate_dataset(self._cfg.data)
  File "/home/DeepViewAgg-release/torch_points3d/datasets/dataset_factory.py", line 47, in instantiate_dataset
    dataset = dataset_cls(dataset_config)
  File "/home/DeepViewAgg-release/torch_points3d/datasets/segmentation/kitti360.py", line 881, in __init__
    transform=self.train_transform)
  File "/home/DeepViewAgg-release/torch_points3d/datasets/segmentation/kitti360.py", line 264, in __init__
    super().__init__(root, transform, pre_transform, pre_filter)
  File "/opt/conda/lib/python3.7/site-packages/torch_geometric/data/in_memory_dataset.py", line 55, in __init__
    pre_filter)
  File "/opt/conda/lib/python3.7/site-packages/torch_geometric/data/dataset.py", line 92, in __init__
    self._process()
  File "/opt/conda/lib/python3.7/site-packages/torch_geometric/data/dataset.py", line 165, in _process
    self.process()
  File "/home/DeepViewAgg-release/torch_points3d/datasets/segmentation/kitti360.py", line 532, in process
    self._process_3d(*path_tuple)
  File "/home/DeepViewAgg-release/torch_points3d/datasets/segmentation/kitti360.py", line 567, in _process_3d
    raw_window_path, instance=self._keep_instance, remap=True)
  File "/home/DeepViewAgg-release/torch_points3d/datasets/segmentation/kitti360.py", line 46, in read_kitti360_window
    data.y = torch.from_numpy(ID2TRAINID)[y] if remap else y
IndexError: index 8 is out of bounds for dimension 0 with size 5

I changed the labels in the kitti360_config.py file to:

     Label(  'Never classified' ,  0 , -1 , 0 , 'void' , 0 , False , True  , True  , (150, 150, 150) ),
     Label(  'Unclassified'     ,  1 , -1 , 1 , 'void' , 1 , True  , False , False , (217, 217, 217) ),
     Label(  'pipeline'         ,  8 , -1 , 2 , 'void' , 2 , True  , False , False , (241,   2,   2) ),
     Label(  'bracket'          , 19 , -1 , 3 , 'void' , 3 , True  , False , False , (  5, 115, 252) ),
     Label(  'hole'             , 20 , -1 , 4 , 'void' , 4 , True  , False , False , (105, 248,  12) ),
drprojects commented 1 year ago

Hi, thanks for using this project!

If I understand correctly, you replaced all the official KITTI-360 labels here with the lines you show above?

If so, the error is to be expected: read_kitti360_window reads the raw dataset files, recovers each point's label id (the id column in the kitti360_config.py file) and maps it to the corresponding trainId. Since your labels only seem to cover the ids {0, 1, 8, 19, 20}, the program does not know how to map every raw id it encounters to a trainId of your choosing.
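For illustration, here is a minimal sketch of the kind of id-to-trainId lookup that fails; the 5-entry table below is a stand-in for illustration only, not the exact ID2TRAINID the repository builds:

    import numpy as np
    import torch

    # Hypothetical remapping table with one entry per declared label, i.e. only 5 slots.
    ID2TRAINID = np.array([0, 1, 2, 3, 4])

    # Raw per-point label ids read from a KITTI-360 window can be larger than the
    # table (e.g. id 8), at which point the lookup reported in the traceback fails.
    y = torch.tensor([0, 1, 8])

    try:
        remapped = torch.from_numpy(ID2TRAINID)[y]
    except IndexError as e:
        print(e)  # index 8 is out of bounds for dimension 0 with size 5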

In other words, if you want to build your own labels, you need to keep the same structure as here and only fill in the trainId column with the train IDs of your choosing.

Optionally, you can also modify the content of the name and color columns if you feel adventurous!
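For example, here is a rough sketch of what such a custom label list could look like. The field names follow the KITTI-360 devkit convention and are assumed here; check kitti360_config.py for the exact ones. The names, trainIds and colors are purely illustrative:

    from collections import namedtuple

    # Same 10-field layout as the Label tuples in kitti360_config.py (assumed field names).
    Label = namedtuple('Label', [
        'name', 'id', 'kittiId', 'trainId', 'category', 'categoryId',
        'hasInstances', 'ignoreInEval', 'ignoreInInst', 'color'])

    # Keep one Label per official KITTI-360 id so every raw id found in the data
    # can be remapped; only the trainId column routes each id to one of your classes.
    labels = [
        Label('unlabeled',   0, -1, 0, 'void', 0, False, True, True, (150, 150, 150)),
        Label('ego vehicle', 1, -1, 0, 'void', 0, False, True, True, (150, 150, 150)),
        # ... keep an entry for every remaining official id, each with the trainId
        # of your choosing ...
    ]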

Please let me know if that solves your issue.

drprojects commented 1 year ago

I consider this issue solved, closing it for now. Feel free to re-open if need be.