loicland / superpoint_graph

Large-scale Point Cloud Semantic Segmentation with Superpoint Graphs
MIT License

Testing on reduced testing set, IndexError: list index out of range #202

Closed Tojens closed 4 years ago

Tojens commented 4 years ago

Hi

I almost have your code running on a custom dataset, which is formatted in the same manner as Semantic3D. I have fewer classes, and I am fairly certain I have changed the code accordingly. However, once testing begins during the final phase of training, the following error pops up. I hope you might be able to help fix this :)

```
  0%|                            | 0/10 [00:00<?, ?it/s]
shape of input torch.Size([7893, 11, 128])
 10%|██▉                         | 1/10 [00:18<02:43, 18.15s/it]
shape of input torch.Size([9540, 11, 128])
 20%|█████▊                      | 2/10 [00:41<02:38, 19.85s/it]
shape of input torch.Size([4809, 11, 128])
 30%|████████▋                   | 3/10 [00:53<02:01, 17.36s/it]
shape of input torch.Size([2325, 11, 128])
 40%|███████████▌                | 4/10 [00:58<01:22, 13.75s/it]
shape of input torch.Size([1862, 11, 128])
 50%|██████████████▌             | 5/10 [01:02<00:54, 10.84s/it]
shape of input torch.Size([5426, 11, 128])
 60%|█████████████████▍          | 6/10 [01:15<00:45, 11.37s/it]
shape of input torch.Size([9907, 11, 128])
 70%|████████████████████▎       | 7/10 [01:39<00:45, 15.15s/it]
shape of input torch.Size([8198, 11, 128])
 80%|███████████████████████▏    | 8/10 [02:00<00:33, 16.92s/it]
shape of input torch.Size([182, 11, 128])
 90%|██████████████████████████  | 9/10 [02:00<00:11, 11.96s/it]
shape of input torch.Size([2851, 11, 128])
100%|████████████████████████████| 10/10 [02:07<00:00, 10.40s/it]
Traceback (most recent call last):
  File "learning/main.py", line 459, in <module>
    main()
  File "learning/main.py", line 381, in main
    acc_test, oacc_test, avg_iou_test, per_class_iou_test, predictions_test, avg_acc_test, confusion_matrix = eval_final()
  File "learning/main.py", line 309, in eval_final
    per_class_iou[name] = perclsiou[c]
IndexError: list index out of range
```

Best regards, Tobias

loicland commented 4 years ago

Hi,

First question: this `shape of input` line is an output that you added, right? Not some debug print I forgot to remove?

As for your error, can you print `dbinfo['inv_class_map']` at line 308 of `eval_final`?

Tojens commented 4 years ago

Yeah, that's a print I've put in :)

I printed both `dbinfo['inv_class_map']` and `perclsiou[c]`; the output is below:

```
100%|████████████████████████████| 10/10 [02:05<00:00, 10.31s/it]
dbinfo:  {1: 'terrain', 2: 'vegetation', 3: 'noise_lower', 4: 'ledninger', 5: 'crossbeam', 6: 'noise_upper'} 0.9135548819785088
dbinfo:  {1: 'terrain', 2: 'vegetation', 3: 'noise_lower', 4: 'ledninger', 5: 'crossbeam', 6: 'noise_upper'} 0.8205557415467024
dbinfo:  {1: 'terrain', 2: 'vegetation', 3: 'noise_lower', 4: 'ledninger', 5: 'crossbeam', 6: 'noise_upper'} 0.011051764765500292
dbinfo:  {1: 'terrain', 2: 'vegetation', 3: 'noise_lower', 4: 'ledninger', 5: 'crossbeam', 6: 'noise_upper'} 0.0
dbinfo:  {1: 'terrain', 2: 'vegetation', 3: 'noise_lower', 4: 'ledninger', 5: 'crossbeam', 6: 'noise_upper'} 0.46641487696217115
Traceback (most recent call last):
  File "learning/main.py", line 460, in <module>
    main()
  File "learning/main.py", line 382, in main
    acc_test, oacc_test, avg_iou_test, per_class_iou_test, predictions_test, avg_acc_test, confusion_matrix = eval_final()
  File "learning/main.py", line 309, in eval_final
    print('dbinfo: ', dbinfo['inv_class_map'], perclsiou[c])
IndexError: list index out of range
```

I'm not exactly sure what the problem might be. Should I have a 0 class? I've removed that class.
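To illustrate my suspicion, here is a minimal standalone sketch (not the actual `learning/main.py` code; the IoU values are made up) of how a 1-indexed map would run off the end of a 0-indexed list:

```python
# Minimal standalone sketch of the suspected indexing mismatch
# (illustrative only; the IoU values below are made up).
perclsiou = [0.91, 0.82, 0.01, 0.0, 0.47, 0.33]  # 6 classes -> valid indices 0..5

# inv_class_map as I currently have it: keyed by the 1-indexed file labels.
inv_class_map = {1: 'terrain', 2: 'vegetation', 3: 'noise_lower',
                 4: 'ledninger', 5: 'crossbeam', 6: 'noise_upper'}

per_class_iou = {}
try:
    for c, name in inv_class_map.items():
        per_class_iou[name] = perclsiou[c]  # c == 6 falls off the end of the list
except IndexError as e:
    print('IndexError:', e)

# Note the silent off-by-one for the classes that *do* get assigned:
# 'terrain' (key 1) receives perclsiou[1], which is vegetation's IoU.
```

That would also explain why only five `dbinfo:` lines were printed before the crash.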

Tojens commented 4 years ago

Have you had a chance to take a look at a possible cause?

loicland commented 4 years ago

Sorry, I forgot to answer.

There is an odd design choice we made wrt labels, but which makes implementation easier: label 0 in the data files is reserved for unlabeled points, so file labels run from 1 to n_classes, while the network's predictions (and hence `perclsiou`) are 0-indexed. Your `inv_class_map` uses the 1-indexed file labels as keys, so its last entry asks for `perclsiou[6]` in a 6-element list.

Hence you should update your `inv_class_map` in `get_info` so that its keys are the 0-indexed prediction classes. Use the ones we wrote for s3dis and sema3d as examples.
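As a rough sketch for your six classes (illustrative only; the real `get_info` builds a larger `dbinfo` dict, and the `get_info_sketch` name here is mine), the map would look like:

```python
# Rough sketch of a corrected get_info entry (illustrative only; the real
# get_info in this repo returns a larger dbinfo dict). Keys are the 0-indexed
# prediction classes; label 0 in the data files stays reserved for unlabeled
# points and does not appear here.
def get_info_sketch():
    return {
        'classes': 6,
        'inv_class_map': {0: 'terrain', 1: 'vegetation', 2: 'noise_lower',
                          3: 'ledninger', 4: 'crossbeam', 5: 'noise_upper'},
    }
```

With 0-indexed keys, `perclsiou[c]` stays inside the 6-element list and each class name gets its own IoU.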

More details in #45, #63, #83, #108, #166.

And let me know if it fixes things!

Tojens commented 4 years ago

It worked! Thanks for the help, it has really helped my work along.

Stay healthy