voldemortX / DST-CBC

Implementation of our Pattern Recognition paper "DMT: Dynamic Mutual Training for Semi-Supervised Learning"
BSD 3-Clause "New" or "Revised" License

Question about label mapping for cityscapes dataset #10

Closed · revaeb closed this 3 years ago

revaeb commented 3 years ago

When I was using part of your code for the Cityscapes benchmark, I ran into the following error:

```
IndexError: Caught IndexError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/torch/utils/data/_utils/worker.py", line 185, in _worker_loop
    data = fetcher.fetch(index)
  File "/usr/local/lib/python3.6/dist-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/usr/local/lib/python3.6/dist-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "../utils/datasets.py", line 155, in __getitem__
    img1, target1 = self.transforms(img, target)
  File "../utils/transforms.py", line 27, in __call__
    image, target = t(image, target)
  File "../utils/transforms.py", line 216, in __call__
    target = target if type(target) == str else self.label_id_map[target]
IndexError: index 255 is out of bounds for dimension 0 with size 34
```

It seems that LabelMap(label_id_map_city) didn't work correctly. This is my first time using this benchmark, so I don't know how to deal with the problem. Could you please give me some hints?

voldemortX commented 3 years ago

@revaeb If you are using only part of my code, I'm not sure I can resolve the entire problem for you.

However, I can give you some background on the Cityscapes dataset.

  1. Cityscapes has more classes than the set used for segmentation training, so a mapping is required to convert the original labels.

  2. In segmentation, ignored regions are usually labeled 255. With an array-based mapping indexed by label id, covering 255 would require an array of at least 256 entries, so for some datasets you need to map these ignored labels explicitly as well. I don't think the original Cityscapes annotations have this problem, though. It is also possible that your dataset class already hard-codes the label mapping, in which case you don't need LabelMap at all (see the sketch below for how the array mapping works).
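For concreteness, here is a minimal sketch of such an array mapping. The raw-id to train-id pairs below follow the standard 19-class Cityscapes convention; the array is only assumed to behave like label_id_map_city, it is not copied from the repo:

```python
import torch

# Standard Cityscapes raw id -> train id pairs for the 19 training classes; every other
# raw id maps to the ignore value 255. This sketches how an array-based LabelMap can work.
RAW_TO_TRAIN = {7: 0, 8: 1, 11: 2, 12: 3, 13: 4, 17: 5, 19: 6, 20: 7, 21: 8, 22: 9,
                23: 10, 24: 11, 25: 12, 26: 13, 27: 14, 28: 15, 31: 16, 32: 17, 33: 18}

label_id_map = torch.full((34,), 255, dtype=torch.long)  # valid indices: raw ids 0..33
for raw_id, train_id in RAW_TO_TRAIN.items():
    label_id_map[raw_id] = train_id

# target = label_id_map[target] only works while every value in target is in 0..33;
# any 255 in the target (e.g. from padding) indexes past the end and raises exactly
# "IndexError: index 255 is out of bounds for dimension 0 with size 34".
```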

revaeb commented 3 years ago

Sorry that I didn't make it clear. I'm using your code as my code base now and it works really well on the VOC 2012 benchmark. But when I tried to use it on the Cityscapes benchmark, the error above came up. I just followed your instructions: downloaded the dataset, put it into the directory, and generated the data splits. So I'm curious why I can't use your dataloader correctly. I found that there is an official Cityscapes script named labels.py that maps each id to a train id, which is not mentioned in your instructions. Should I run that too? Thank you so much for your kind reply.

voldemortX commented 3 years ago

I don't remember any processing from the official scripts; maybe try adding the 255 filtering? The original Cityscapes annotations shouldn't contain 255.
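If you want to rule out the annotations themselves, a quick check along these lines should work (the file name is just an example, and numpy/PIL are assumed to be available):

```python
import numpy as np
from PIL import Image

# Hypothetical sanity check: verify that a raw Cityscapes *_labelIds.png stays within
# 0..33 before the array-based label mapping runs (the path below is only an example).
label = np.array(Image.open("aachen_000000_000019_gtFine_labelIds.png"))
print(label.min(), label.max())  # raw Cityscapes ids should fall in 0..33
assert label.max() < 34, "values like 255 here would break a size-34 mapping array"
```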

voldemortX commented 3 years ago

@revaeb I can think of one case that would introduce 255 in my code: if you use any transform that introduces padding (e.g., translation) before LabelMap, it will cause this error.
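To make the failure mode concrete, here is a minimal PyTorch sketch (tensor shapes and values are made up; this is not the repo's actual transform code): padding writes 255 into the target, and indexing a size-34 mapping array with 255 reproduces the error above.

```python
import torch
import torch.nn.functional as F

label_id_map = torch.full((34,), 255, dtype=torch.long)  # size-34 mapping array, as sketched above
target = torch.randint(0, 34, (4, 4))                    # a valid raw-id label patch
padded = F.pad(target, (1, 1, 1, 1), value=255)          # padding fills the border with 255

try:
    mapped = label_id_map[padded]
except IndexError as e:
    print(e)  # index 255 is out of bounds for dimension 0 with size 34

# Applying the label mapping first and padding afterwards avoids this, because 255 then
# only appears as the already-mapped ignore index and is never used to index the array.
```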

revaeb commented 3 years ago

Uhhh, yes. I used padding to fill the cropped images and labels, so I think I know why this error came up. Thank you so much for your excellent work and your help tonight!

voldemortX commented 3 years ago

You're welcome! I'll close this issue for now since your problem is resolved.

Feel free to reopen if there are further problems.