Then, when execution reaches for ibs, batch in enumerate(train_loader):, it reports:
Traceback (most recent call last):
File "/home/ouc/anaconda3/envs/Kiruto/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 724, in _try_get_data
data = self.data_queue.get(timeout=timeout)
File "/home/ouc/anaconda3/envs/Kiruto/lib/python3.6/multiprocessing/queues.py", line 104, in get
if not self._poll(timeout):
File "/home/ouc/anaconda3/envs/Kiruto/lib/python3.6/multiprocessing/connection.py", line 257, in poll
return self._poll(timeout)
File "/home/ouc/anaconda3/envs/Kiruto/lib/python3.6/multiprocessing/connection.py", line 414, in _poll
r = wait([self], timeout)
File "/home/ouc/anaconda3/envs/Kiruto/lib/python3.6/multiprocessing/connection.py", line 911, in wait
ready = selector.select(timeout)
File "/home/ouc/anaconda3/envs/Kiruto/lib/python3.6/selectors.py", line 376, in select
fd_event_list = self._poll.poll(timeout)
File "/home/ouc/anaconda3/envs/Kiruto/lib/python3.6/site-packages/torch/utils/data/_utils/signal_handling.py", line 66, in handler
_error_if_any_worker_fails()
RuntimeError: DataLoader worker (pid 581) is killed by signal: Segmentation fault.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ouc/TXH/IDEA_code/PVN3D/pvn3d/train/train_linemod_pvn3d.py", line 366, in train
for ibs, batch in enumerate(train_loader):
File "/home/ouc/anaconda3/envs/Kiruto/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 804, in __next__
idx, data = self._get_data()
File "/home/ouc/anaconda3/envs/Kiruto/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 771, in _get_data
success, data = self._try_get_data()
File "/home/ouc/anaconda3/envs/Kiruto/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 737, in _try_get_data
raise RuntimeError('DataLoader worker (pid(s) {}) exited unexpectedly'.format(pids_str))
RuntimeError: DataLoader worker (pid(s) 581) exited unexpectedly
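For context, when a DataLoader worker dies with a segfault, a common first step is to run the loader single-process so that any Python-level exception in the dataset is raised directly in the main process instead of killing a worker. A minimal sketch with a toy dataset (the dataset, shapes, and batch size here are placeholders, not PVN3D's actual data pipeline):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in dataset: 8 samples of 3 features each, with dummy labels.
dataset = TensorDataset(torch.randn(8, 3), torch.zeros(8))

# num_workers=0 loads batches in the main process, so an error in
# __getitem__ surfaces as a normal traceback rather than a dead worker.
train_loader = DataLoader(dataset, batch_size=4, num_workers=0)

for ibs, batch in enumerate(train_loader):
    inputs, labels = batch
```

Note that a hard segfault in native code (e.g. inside a C extension used by the dataset) will still crash the main process this way, but the crash point is then easier to localize.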
I tried setting a smaller num_workers and min_batch_size, but that did not work. When I set num_workers=0, it also fails, reporting:

ValueError: current limit exceeds maximum limit
so I changed that limit. After the change, the run produced the segmentation-fault traceback above. Is this error caused by my device or by my version of torch?
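For reference, "ValueError: current limit exceeds maximum limit" is what Python's resource.setrlimit raises when the requested soft limit is above the hard limit. A sketch of raising the open-file limit without tripping that error (assuming the limit being changed is RLIMIT_NOFILE, which is the one DataLoader workers commonly exhaust; the 4096 target is an arbitrary example, not a value from the original post):

```python
import resource

# setrlimit raises "ValueError: current limit exceeds maximum limit"
# when the requested soft limit is above the hard limit, so cap the
# request at the hard limit instead of asking for a fixed number.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
target = 4096  # example target; choose what the training run needs

if hard == resource.RLIM_INFINITY:
    new_soft = max(soft, target)
else:
    # Never lower the current soft limit, never exceed the hard limit.
    new_soft = max(soft, min(target, hard))

resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
```

An alternative that avoids file-descriptor pressure entirely is torch.multiprocessing.set_sharing_strategy('file_system'), which trades descriptors for files in shared memory.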