After preparing the LUNA dataset I obtained _clean.npy and _label.npy files for training, plus _spacing.npy, _extendbox.npy, and _origin.npy. But when running train_detector_se.py from the luna_detector directory I get the following error:
Traceback (most recent call last):
File "train_detector_se.py", line 338, in
main()
File "train_detector_se.py", line 113, in main
split_comber=split_comber)
File "/home/movchinar/home/movchinar/DeepSEED-3D-ConvNets-for-Pulmonary-Nodule-Detection/luna_detector/data_loader.py", line 56, in init
l = np.load(os.path.join(data_dir, '%s_label.npy' %idx))
File "/home/movchinar/.local/lib/python3.6/site-packages/numpy/lib/npyio.py", line 416, in load
fid = stack.enter_context(open(os_fspath(file), "rb"))
FileNotFoundError: [Errno 2] No such file or directory: '/fast-drive/movchinar/LUNA/prepross/668_label.npy'
In fact, there is no file with index 668 in my preprocessed data. Could you please explain the algorithm that builds the indexes?