Namespace(batch_size=1, checkpoint='', classes=11, cuda=True, dataset='camvid', gpus='0', model='ENet', num_workers=2, save_seg_dir='./server/camvid/predict/ENet')
=====> use gpu id: '0'
find file: ./dataset/inform/camvid_inform.pkl
length of Validation set: 233
=====> beginning testing
test set length: 233
Traceback (most recent call last):
  File "/home/Downloads/Efficient-Segmentation-Networks/predict.py", line 119, in <module>
    test_model(args)
  File "/home/Downloads/Efficient-Segmentation-Networks/predict.py", line 102, in test_model
    predict(args, testLoader, model)
  File "/home/Downloads/Efficient-Segmentation-Networks/predict.py", line 44, in predict
    for i, (input, size, name) in enumerate(test_loader):
ValueError: too many values to unpack (expected 3)
If you want to run prediction on the CamVid dataset, you should use test.py; predict.py is only meant to generate Cityscapes predictions for upload to the official evaluation server.
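The error itself comes from a batch-shape mismatch: the loop in predict.py unpacks each batch into exactly three values, while the CamVid loader yields a different number of items per batch (for example, it may also include ground-truth labels, since CamVid's test split has annotations). A minimal sketch of the failure, using hypothetical tuple contents:

```python
# Hypothetical batch layouts; the actual fields yielded by this repo's
# dataset loaders may differ.
cityscapes_batch = ("img_tensor", "size", "name")          # 3 items, what predict.py expects
camvid_batch = ("img_tensor", "label", "size", "name")     # 4 items, labels included

def unpack_for_predict(batch):
    # Mirrors predict.py line 44: for i, (input, size, name) in enumerate(test_loader)
    input, size, name = batch
    return input, size, name

unpack_for_predict(cityscapes_batch)   # works: 3 values into 3 names

try:
    unpack_for_predict(camvid_batch)   # 4 values into 3 names
except ValueError as e:
    print(e)                           # too many values to unpack (expected 3)
```

This is why switching to test.py (whose loop matches the CamVid loader's batch layout) resolves the crash.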