When I cloned the repository and ran main.py (without changing anything), I got this output:
not enough values to unpack (expected 2, got 0)
==> Preparing data..
Files already downloaded and verified
Files already downloaded and verified
Training with dataset CIFAR10 and 10 classes
==> Building model..
==> Checkpoints will be saved to: ./checkpoint/ckpt-CIFAR10-ResNet18.pth
classes: (callable)
Epoch: 0
Traceback (most recent call last):
  File "<ipython-input-7-f9bd8031870b>", line 1, in <module>
    runfile('C:/Users/Matthew Chen/Documents/GitHub/neural-backed-decision-trees/main.py', wdir='C:/Users/Matthew Chen/Documents/GitHub/neural-backed-decision-trees')
  File "C:\Users\Matthew Chen\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 827, in runfile
    execfile(filename, namespace)
  File "C:\Users\Matthew Chen\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 110, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)
  File "C:/Users/Matthew Chen/Documents/GitHub/neural-backed-decision-trees/main.py", line 315, in <module>
    train(epoch, analyzer)
  File "C:/Users/Matthew Chen/Documents/GitHub/neural-backed-decision-trees/main.py", line 227, in train
    for batch_idx, (inputs, targets) in enumerate(trainloader):
  File "C:\Users\Matthew Chen\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 279, in __iter__
    return _MultiProcessingDataLoaderIter(self)
  File "C:\Users\Matthew Chen\Anaconda3\lib\site-packages\torch\utils\data\dataloader.py", line 719, in __init__
    w.start()
  File "C:\Users\Matthew Chen\Anaconda3\lib\multiprocessing\process.py", line 112, in start
    self._popen = self._Popen(self)
  File "C:\Users\Matthew Chen\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Users\Matthew Chen\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "C:\Users\Matthew Chen\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 89, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Users\Matthew Chen\Anaconda3\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
BrokenPipeError: [Errno 32] Broken pipe
I'm not sure whether I installed all the packages correctly. At first this looked like a request to some server being blocked or timing out, but the traceback actually ends in a BrokenPipeError raised while the DataLoader is starting its worker processes, so is this a Windows multiprocessing issue rather than a network one?
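For reference, my understanding is that with num_workers > 0 PyTorch starts DataLoader workers via spawn on Windows, which re-imports the main module in each worker; if the training loop runs at import time, worker startup crashes. Below is a minimal sketch of the usual guard (the dataset and loader arguments here are placeholders I made up, not the ones main.py actually uses); setting num_workers=0 also sidesteps it:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder dataset; main.py trains on CIFAR10, but any dataset
# exercises the same worker-spawning code path.
dataset = TensorDataset(torch.randn(64, 3, 32, 32),
                        torch.randint(0, 10, (64,)))

# num_workers > 0 makes the DataLoader start worker processes. On
# Windows, multiprocessing uses "spawn", which re-imports this module
# in every worker.
trainloader = DataLoader(dataset, batch_size=16, shuffle=True,
                         num_workers=2)

def train():
    # Iterating the loader is what actually starts the workers
    # (matching the w.start() frame in the traceback above).
    for batch_idx, (inputs, targets) in enumerate(trainloader):
        pass  # training step would go here

if __name__ == '__main__':
    # Without this guard, the re-import in each spawned worker would
    # re-run the training code at import time, and worker startup can
    # fail with BrokenPipeError / EOFError.
    train()
```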