I encountered the following error while running the pre-training task on the large dataset, and I hope the author can provide an answer.
Start to load Faster-RCNN detected objects from data/mscoco_imgfeat/train2014_obj36.tsv
Traceback (most recent call last):
File "src/pretrain/lxmert_pretrain.py", line 46, in <module>
train_tuple = get_tuple(args.train, args.batch_size, shuffle=True, drop_last=True)
File "src/pretrain/lxmert_pretrain.py", line 33, in get_tuple
tset = LXMERTTorchDataset(dset, topk)
File "/home/af/Downloads/zyd/lxmert-master/src/pretrain/lxmert_data.py", line 101, in __init__
img_data.extend(load_obj_tsv(Split2ImgFeatPath[source], topk))
File "/home/af/Downloads/zyd/lxmert-master/src/utils.py", line 45, in load_obj_tsv
item[key] = np.frombuffer(base64.b64decode(item[key]), dtype=dtype)
ValueError: buffer size must be a multiple of element size
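For context, this ValueError comes from `np.frombuffer`, which requires the decoded byte count to be an exact multiple of the dtype's item size. A truncated base64 field (e.g. from an incomplete or corrupted download of `train2014_obj36.tsv`) reproduces the same error. Below is a minimal sketch illustrating this, not the project's actual loading code:

```python
import base64
import numpy as np

# A well-formed field: 4 float32 values = 16 bytes, a multiple of 4.
good = base64.b64encode(np.arange(4, dtype=np.float32).tobytes())
arr = np.frombuffer(base64.b64decode(good), dtype=np.float32)
assert arr.shape == (4,)

# A truncated field: 15 bytes is not a multiple of float32's item
# size (4 bytes), so np.frombuffer raises the error seen in the
# traceback above.
bad = base64.b64encode(np.arange(4, dtype=np.float32).tobytes()[:-1])
try:
    np.frombuffer(base64.b64decode(bad), dtype=np.float32)
except ValueError as e:
    print(e)  # buffer size must be a multiple of element size
```

If this is the cause, re-downloading the TSV file and verifying its size or checksum against the released one would be a reasonable first check.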