TinusChen opened this issue 6 years ago
I found a cache directory and fixed the problem like this: rm data/cache -r
Maybe I had used Python 3 to train the model, which led to a pickle serialization compatibility issue.
You can modify this file: ./home/wangwenzhong/project/test/tf-faster-rcnn/lib/datasets/voc_eval.py
# load annotations
recs = {}
for i, imagename in enumerate(imagenames):
    recs[imagename] = parse_rec(annopath.format(imagename))
    if i % 100 == 0:
        print('Reading annotation for {:d}/{:d}'.format(
            i + 1, len(imagenames)))
# save
print('Saving cached annotations to {:s}'.format(cachefile))
with open(cachefile, 'w') as f:
    pickle.dump(recs, f)
# load
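The cache-writing step above opens the file in text mode ('w'), which breaks pickling. A minimal version-safe sketch of that step (recs and cachefile here are stand-in values, not the repo's real data): opening the cache in binary mode and pinning an old pickle protocol keeps the file readable from both Python 2 and Python 3.

```python
import os
import pickle
import tempfile

# Stand-in for the parsed annotations dict built by voc_eval.py.
recs = {'000001': [{'name': 'dog', 'bbox': [48, 240, 195, 371]}]}
cachefile = os.path.join(tempfile.mkdtemp(), 'annots.pkl')

# Binary mode ('wb'), not text mode ('w'); protocol 2 is the highest
# protocol that Python 2 can still read back.
with open(cachefile, 'wb') as f:
    pickle.dump(recs, f, protocol=2)

with open(cachefile, 'rb') as f:
    loaded = pickle.load(f)

assert loaded == recs
```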
I also have this problem. Have you fixed it? How did you do it?
Run the training shell script to train our own dataset: ./experiments/scripts/train_faster_rcnn.sh 0 pascal_voc vgg16 or ./experiments/scripts/train_faster_rcnn.sh 0 pascal_voc res101
Error log:
  File "/path/to/tf-faster-rcnn-master/tools/../lib/datasets/pascal_voc.py", line 105, in gt_roidb
    roidb = pickle.load(fid, encoding='bytes')
TypeError: load() got an unexpected keyword argument 'encoding'
Code:
if os.path.exists(cache_file):
    with open(cache_file, 'rb') as fid:
        try:
            roidb = pickle.load(fid)
        except:
            roidb = pickle.load(fid, encoding='bytes')
    print('{} gt roidb loaded from {}'.format(self.name, cache_file))
    return roidb
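The try/except above still fails on Python 2, because Python 2's pickle.load() has no `encoding` keyword at all, so the fallback line itself raises the reported TypeError. A sketch of a version-safe loader (the helper name load_roidb and the sample data are illustrative, not from the repo): branch on the interpreter version instead of catching the exception.

```python
import os
import pickle
import sys
import tempfile

def load_roidb(cache_file):
    """Load a cached roidb in a way that works on Python 2 and 3.

    Only Python 3's pickle.load() accepts the `encoding` keyword,
    which is needed there to read pickles written by Python 2.
    """
    with open(cache_file, 'rb') as fid:
        if sys.version_info[0] >= 3:
            return pickle.load(fid, encoding='bytes')
        return pickle.load(fid)

# Quick round trip with an illustrative cache file.
path = os.path.join(tempfile.mkdtemp(), 'gt_roidb.pkl')
with open(path, 'wb') as f:
    pickle.dump([{'boxes': [48, 240, 195, 371]}], f)
roidb = load_roidb(path)
print(roidb)
```

Deleting the stale cache (rm data/cache -r), as suggested above, also works because it forces the roidb to be regenerated by the current interpreter.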
Env: Ubuntu 16.04, Anaconda Python 2.7.12