Learning Fine-grained Image Similarity with Deep Ranking is a novel application of neural networks: the authors combine a new multi-scale architecture with a triplet loss to train a network that can perform image search. This repository is a simplified implementation of the same.
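As a rough sketch of the training objective (not this repository's exact code), the triplet loss pulls an anchor embedding toward a positive example and pushes it away from a negative one; PyTorch ships this as `nn.TripletMarginLoss`. The `embedding_net` below is a hypothetical stand-in for the paper's multi-scale network.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the multi-scale embedding network.
embedding_net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 128))

# Triplet loss: distance(anchor, positive) should be smaller than
# distance(anchor, negative) by at least the margin.
triplet_loss = nn.TripletMarginLoss(margin=1.0)

anchor = embedding_net(torch.randn(8, 3, 224, 224))
positive = embedding_net(torch.randn(8, 3, 224, 224))
negative = embedding_net(torch.randn(8, 3, 224, 224))

loss = triplet_loss(anchor, positive, negative)
```

The loss is a scalar that can be backpropagated through the shared embedding network for all three branches.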
Hi, I'm getting an error on my dataloader for the TINY dataset with custom root_dir:
Traceback (most recent call last):
File "model.py", line 59, in <module>
for i, (D, L, IDX) in enumerate(DATALOADER):
File ".../deep-ranking/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 819, in __next__
return self._process_data(data)
File ".../deep-ranking/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 846, in _process_data
data.reraise()
File ".../deep-ranking/lib/python3.7/site-packages/torch/_utils.py", line 369, in reraise
raise self.exc_type(msg)
TypeError: Caught TypeError in DataLoader worker process 0.
Original Traceback (most recent call last):
File ".../deep-ranking/lib/python3.7/site-packages/torch/utils/data/_utils/worker.py", line 178, in _worker_loop
data = fetcher.fetch(index)
File ".../deep-ranking/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 47, in fetch
return self.collate_fn(data)
File ".../deep-ranking/lib/python3.7/site-packages/torch/utils/data/_utils/collate.py", line 80, in default_collate
return [default_collate(samples) for samples in transposed]
File ".../deep-ranking/lib/python3.7/site-packages/torch/utils/data/_utils/collate.py", line 80, in <listcomp>
return [default_collate(samples) for samples in transposed]
File "..../deep-ranking/lib/python3.7/site-packages/torch/utils/data/_utils/collate.py", line 82, in default_collate
raise TypeError(default_collate_err_msg_format.format(elem_type))
TypeError: default_collate: batch must contain tensors, numpy arrays, numbers, dicts or lists; found <class 'NoneType'>
The data path is correct, since the dataloader doesn't throw a "file not found" error. Is the transformation to tensor missing, perhaps? Thanks for your help.
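For what it's worth, this `TypeError` comes from `default_collate`, which cannot batch `None` values. A hypothetical minimal dataset (not the repo's actual `data_utils.py`) reproduces it when `__getitem__` returns a tuple containing `None`:

```python
import torch
from torch.utils.data import Dataset, DataLoader

# Hypothetical dataset: the returned tuple contains None entries,
# which default_collate cannot turn into a batch.
class TinyTriplets(Dataset):
    def __len__(self):
        return 4

    def __getitem__(self, idx):
        img = torch.rand(3, 224, 224)
        return img, img, img, None, None  # trailing Nones break collation

loader = DataLoader(TinyTriplets(), batch_size=2)

caught = None
try:
    batch = next(iter(loader))
except TypeError as exc:
    caught = exc  # "default_collate: batch must contain tensors ... found <class 'NoneType'>"
```

Dropping the `None` entries from the returned tuple (as the EDIT below does) lets `default_collate` stack the remaining tensors normally.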
EDIT: I think this was caused by `images[0], images[1], images[2], None, None` in `data_utils.py`. Removing the `None`s, I get this error in the `forward` function:
RuntimeError: Expected 4-dimensional input for 4-dimensional weight 64 3 7, but got 3-dimensional input of size [3, 224, 224] instead
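That `RuntimeError` usually means a single image (CHW, 3-D) was passed where the convolution expects a batch (NCHW, 4-D), e.g. when feeding a dataset item to the model directly instead of a DataLoader batch. A sketch of the usual fix, assuming a ResNet-style first layer:

```python
import torch
import torch.nn as nn

# A first conv layer like ResNet's expects 4-D NCHW input.
conv = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3)

img = torch.rand(3, 224, 224)   # single image: 3-D (C, H, W)
batched = img.unsqueeze(0)      # add a batch dimension -> (1, 3, 224, 224)
out = conv(batched)             # now the shapes line up
```

If the tensors come through a `DataLoader`, the batch dimension is added automatically, so the `unsqueeze(0)` is only needed when calling the model on a single sample by hand.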