Closed buaazsj closed 5 years ago
Hi, strange. That should not happen. What is stored in data and what is stored in seg? Can you please run np.unique(data_dict['seg'])? Can you also try to set order_seg=0 in the SpatialTransform? Best, Fabian
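The diagnostic Fabian suggests can be sketched with plain NumPy (the array shape and label values below are illustrative, not taken from the issue): a valid segmentation map should contain only a handful of discrete label values.

```python
import numpy as np

# Illustrative segmentation volume: background (0) plus one lesion label (1)
seg = np.zeros((32, 128, 128), dtype=np.float32)
seg[10:20, 40:60, 40:60] = 1.0

# np.unique on a proper segmentation returns only the discrete labels
labels = np.unique(seg)
print(labels)  # [0. 1.]
```

If `np.unique` instead returns thousands of values, the array is no longer a discrete label map.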
Hi @FabianIsensee, first of all, thank you very much for your answer. After I set order_seg=0 it worked, and very fast. The following is the result of running np.unique(batch['seg']). Unlike BraTS2017, 'seg' here contains values that vary continuously from 0 to 1. I am still unclear why changing order_seg helps. Will changing order_seg have other effects? Thanks!
Thanks! It looks like you made a mistake when resampling the segmentation. After your resampling, seg is no longer a segmentation map (discrete values) but contains continuous values. batchgenerators resamples segmentations carefully (to avoid interpolation artifacts) and therefore needs to process each label (i.e. each unique value) separately. If you plug a float map into the 'seg' key, this is going to take ages because there are MANY unique values. If I remember correctly, ISLES has only 2 labels (0=background, 1=lesion), so in your case you can just np.round seg and the problem goes away :-) Best, Fabian
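The fix Fabian describes can be sketched as follows (the shape and the way the continuous values are simulated are illustrative): linear resampling turns a binary mask into a float map with huge numbers of unique values, and rounding restores the two discrete labels.

```python
import numpy as np

# Simulate a binary segmentation that was linearly resampled:
# values are now continuous in [0, 1] instead of exactly 0 or 1
rng = np.random.default_rng(0)
seg_resampled = np.clip(rng.normal(0.5, 0.6, size=(32, 128, 128)), 0.0, 1.0)
n_unique = len(np.unique(seg_resampled))
print(n_unique)  # many thousands of "labels" -> per-label resampling crawls

# For a 2-label task (0=background, 1=lesion), rounding restores a label map
seg_fixed = np.round(seg_resampled)
print(np.unique(seg_fixed))  # [0. 1.]
```

For datasets with more than two labels, rounding to the nearest valid label value would be the analogous fix.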
I changed the data set to ISLES2018 and resized it to (32,128,128). When the data is generated in batches, it gets stuck. After debugging, I found that when the SpatialTransform was removed, it worked. The data augmentation was very slow, but BraTS2017 was not slow. Why is that? Can anyone help me?