Open · bymbhaskar opened this issue 3 years ago
Looks like a numba problem. Maybe switching to another numba version will fix it?
I downgraded numba to 0.39.0 (also tried 0.40.0). Now the following error is coming:
python train_SemanticKITTI.py
Traceback (most recent call last):
File "train_SemanticKITTI.py", line 13, in <module>
This error comes from the llvmlite
package. You can try this fix first: https://github.com/rapidsai/cuml/issues/2389.
If that does not work either, you can use the same versions as in my environment:
numpy 1.18.2
numba 0.39.0
llvmlite 0.24.0
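(For reference, those pins can be installed in one command. This assumes a pip-based virtualenv; these are old releases, so matching wheels may require a Python 3.6/3.7 interpreter.)

```shell
# Pin the environment to the versions listed above.
pip install numpy==1.18.2 numba==0.39.0 llvmlite==0.24.0
```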
I downgraded the llvmlite to
Now the following error is coming:
python train_SemanticKITTI.py
train_SemanticKITTI.py
Namespace(check_iter=4000, data_dir='data', grid_size=[480, 360, 32], model='polar', model_save_path='./SemKITTI_PolarSeg.pt', train_batch_size=2, val_batch_size=2)
0%| | 0/9565 [00:00<?, ?it/s]Traceback (most recent call last):
File "train_SemanticKITTI.py", line 197, in <module>
0%|
I tried the versions you specified (numpy 1.18.2, numba 0.39.0, llvmlite 0.24.0).
Now it shows:
python train_SemanticKITTI.py Segmentation fault (core dumped)
I found that the error comes from this import:
from network.ptBEV import ptBEVnet
Is this required if I want to work on KITTI only?
Yes, it is required. It's the feature encoder in our model.
Segmentation fault (core dumped)
Segmentation fault (core dumped). Hello, how can this error be fixed? Looking forward to your reply, thanks!
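For anyone hitting the bare segfault: a quick way to see which Python frame the crash originates from is the stdlib faulthandler module. This is a debugging sketch, not a fix by itself; it assumes you can edit the top of the training script.

```python
# Debugging sketch: dump a Python traceback if the interpreter crashes.
# Put these two lines at the very top of train_SemanticKITTI.py, before
# the network.ptBEV / numba imports reported to trigger the segfault.
import faulthandler

faulthandler.enable()  # dump all thread stacks on SIGSEGV and friends
```

Alternatively, run the script unmodified with `python -X faulthandler train_SemanticKITTI.py` for the same effect.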
Hello, I also ran into this problem. How can it be fixed? Looking forward to your reply, thanks!
Error log:
python train_SemanticKITTI.py
train_SemanticKITTI.py
Namespace(check_iter=4000, data_dir='data', grid_size=[480, 360, 32], model='polar', model_save_path='./SemKITTI_PolarSeg.pt', train_batch_size=2, val_batch_size=2)
0%| | 0/9565 [00:00<?, ?it/s]Traceback (most recent call last):
File "train_SemanticKITTI.py", line 197, in <module>
main(args)
File "train_SemanticKITTI.py", line 107, in main
for i_iter,(_,train_vox_label,train_grid,_,train_pt_fea) in enumerate(train_dataset_loader):
File "/home/lidar/bhaskar/polarenv/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 435, in __next__
data = self._next_data()
File "/home/lidar/bhaskar/polarenv/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 1085, in _next_data
return self._process_data(data)
File "/home/lidar/bhaskar/polarenv/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 1111, in _process_data
data.reraise()
File "/home/lidar/bhaskar/polarenv/lib/python3.7/site-packages/torch/_utils.py", line 428, in reraise
raise self.exc_type(msg)
TypeError: Caught TypeError in DataLoader worker process 0.
Original Traceback (most recent call last):
File "/home/lidar/bhaskar/polarenv/lib/python3.7/site-packages/torch/utils/data/_utils/worker.py", line 198, in _worker_loop
data = fetcher.fetch(index)
File "/home/lidar/bhaskar/polarenv/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
data = [self.dataset[idx] for idx in possibly_batched_index]
File "/home/lidar/bhaskar/polarenv/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
data = [self.dataset[idx] for idx in possibly_batched_index]
File "/home/lidar/bhaskar/PolarSeg/dataloader/dataset.py", line 238, in __getitem__
processed_label = nb_process_label(np.copy(processed_label),label_voxel_pair)
TypeError: expected dtype object, got 'numpy.dtype[uint8]'
0%| | 0/9565 [00:01<?, ?it/s]
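For later readers: this exact `TypeError: expected dtype object, got 'numpy.dtype[uint8]'` is the well-known incompatibility between older numba releases and numpy >= 1.20, which changed how dtype objects are represented. It is usually resolved either by keeping numpy below 1.20 or by upgrading numba to a release that supports the new dtypes (0.53 or later). A small check sketch, assuming only that numpy is importable:

```python
# Sketch: flag the numpy/numba version mismatch that typically causes
# "expected dtype object, got 'numpy.dtype[uint8]'" in jitted code
# such as nb_process_label in dataloader/dataset.py.
import numpy as np

major, minor = (int(part) for part in np.__version__.split(".")[:2])
if (major, minor) >= (1, 20):
    # numpy 1.20 changed dtype classes; numba releases before 0.53 choke on them
    print("numpy %s: use numba >= 0.53, or pin numpy < 1.20" % np.__version__)
else:
    print("numpy %s: old numba releases (e.g. 0.39) should be compatible"
          % np.__version__)
```

The versions pinned earlier in this thread (numpy 1.18.2, numba 0.39.0, llvmlite 0.24.0) satisfy the first option; upgrading numba and llvmlite together is the other route.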