HuguesTHOMAS / KPConv-PyTorch

Kernel Point Convolution implemented in PyTorch
MIT License

Errors in the original data sets #158

Open Shin-Kitahara opened 2 years ago

Shin-Kitahara commented 2 years ago

Hello Mr. Thomas. Thank you for your wonderful work. When I run KPConv on my own dataset, I get the following error.

(Error Description)
________________________________________________________
RuntimeError: Caught RuntimeError in DataLoader worker process 0.
RuntimeError: cannot perform reduction function argmin on a tensor with no elements because the operation does not have an identity
________________________________________________________

If you know the cause of this error, please let me know. I apologize for my poor English.

HuguesTHOMAS commented 2 years ago

It seems that there is an empty array somewhere. You can investigate it by setting the number of workers to 0 in the train_XXXX.py script, so that the error is raised directly in the main process instead of inside a DataLoader worker. The bug probably appears in the __getitem__ function of your dataset class.
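For reference, a minimal sketch of that change, assuming the worker count is exposed through a config attribute such as input_threads as in the stock training scripts (attribute and variable names assumed):

```python
from torch.utils.data import DataLoader

# Sketch only: in train_XXXX.py, force single-process data loading so the
# exception is raised in the main process with a direct traceback.
# config.input_threads is the attribute name assumed from the stock scripts.
config.input_threads = 0

training_loader = DataLoader(training_dataset,
                             batch_size=1,
                             num_workers=config.input_threads)  # 0 = load batches in the main process
```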

Shin-Kitahara commented 2 years ago

Hello Mr. Thomas. Thank you for your help. I found an empty tensor. Could you please tell me what can cause this empty tensor, and what I should do about it?

(below is the error message)
________________________________________________________
Initialize workers
tensor([], dtype=torch.float64)
tensor([], dtype=torch.float64)
Traceback (most recent call last):
  File "test_kyoukyaku.py", line 219, in <module>
    tester.cloud_segmentation_test(net, test_loader, config)
  File "/home/tanaka/anaconda3/envs/kpconv/KPConv-PyTorch/utils/tester.py", line 244, in cloud_segmentation_test
    for i, batch in enumerate(test_loader):
  File "/home/tanaka/anaconda3/envs/kpconv/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 517, in __next__
    data = self._next_data()
  File "/home/tanaka/anaconda3/envs/kpconv/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 1199, in _next_data
    return self._process_data(data)
  File "/home/tanaka/anaconda3/envs/kpconv/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 1225, in _process_data
    data.reraise()
  File "/home/tanaka/anaconda3/envs/kpconv/lib/python3.6/site-packages/torch/_utils.py", line 429, in reraise
    raise self.exc_type(msg)
RuntimeError: Caught RuntimeError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/home/tanaka/anaconda3/envs/kpconv/lib/python3.6/site-packages/torch/utils/data/_utils/worker.py", line 202, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/tanaka/anaconda3/envs/kpconv/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/tanaka/anaconda3/envs/kpconv/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/tanaka/anaconda3/envs/kpconv/KPConv-PyTorch/datasets/kyoukyaku_S3DIS.py", line 233, in __getitem__
    return self.potential_item(batch_i)
  File "/home/tanaka/anaconda3/envs/kpconv/KPConv-PyTorch/datasets/kyoukyaku_S3DIS.py", line 294, in potential_item
    cloud_ind = int(torch.argmin(self.min_potentials))
RuntimeError: cannot perform reduction function argmin on a tensor with no elements because the operation does not have an identity
________________________________________________________

HuguesTHOMAS commented 2 years ago

The min_potentials are defined here: https://github.com/HuguesTHOMAS/KPConv-PyTorch/blob/e600c1667d085aeb5cf89d8dbe5a97aad4270d88/datasets/S3DIS.py#L182-L191

If self.min_potentials is empty, it means that self.pot_trees is too.
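For context, that initialization does roughly the following (a paraphrased sketch, not the exact repo code): one random potential per point of each potential tree, and one entry in self.min_potentials per cloud, so if self.pot_trees is empty the loop never runs and self.min_potentials stays empty.

```python
# Paraphrased sketch of the potentials initialization (method body excerpt;
# assumes numpy as np and torch are imported, as in the dataset module).
self.potentials = []
self.min_potentials = []
self.argmin_potentials = []
for i, tree in enumerate(self.pot_trees):      # empty pot_trees -> loop body never executes
    self.potentials += [torch.from_numpy(np.random.rand(tree.data.shape[0]) * 1e-3)]
    min_ind = int(torch.argmin(self.potentials[-1]))
    self.argmin_potentials += [min_ind]
    self.min_potentials += [float(self.potentials[-1][min_ind])]
# The lists are later converted to tensors; with no clouds, self.min_potentials
# is an empty tensor and torch.argmin(self.min_potentials) raises the error above.
```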

You probably modified the function self.load_subsampled_clouds() where self.pot_trees should be defined: https://github.com/HuguesTHOMAS/KPConv-PyTorch/blob/e600c1667d085aeb5cf89d8dbe5a97aad4270d88/datasets/S3DIS.py#L777-L821
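If you want to catch this earlier, a quick sanity check at the end of your dataset's __init__, after load_subsampled_clouds() has run, will fail immediately instead of inside a worker process (a hypothetical addition; attribute names taken from the stock S3DIS dataset):

```python
# Hypothetical sanity checks to add at the end of the dataset __init__:
assert len(self.cloud_names) > 0, 'No clouds found for this split: check dataset paths and split names'
assert len(self.pot_trees) > 0, 'pot_trees is empty: load_subsampled_clouds() did not build potential trees'
assert len(self.min_potentials) > 0, 'min_potentials is empty: torch.argmin in potential_item() will fail'
```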