Closed Alison-brie closed 4 years ago
Running on a different size (or a differently preprocessed dataset) requires retraining. To use our pretrained models, images should be resampled to 128^3. Using scipy.ndimage.zoom with order=1 is fine for images; use order=0 for segmentation maps. Please also make sure that your data is correctly preprocessed; for MRIs, the skull should be stripped before feeding the volume into the network, and the image should be normalized and cropped.
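A minimal sketch of the resampling step described above, assuming a single-channel numpy volume. The interpolation orders follow the comment (order=1 for intensities, order=0 for labels); the min-max normalization at the end is only an illustrative assumption, since the repository's exact normalization scheme is not specified here.

```python
import numpy as np
from scipy.ndimage import zoom

def resample_to_128(volume, seg=None):
    """Resample an image (and optionally its segmentation) to 128^3."""
    target = (128, 128, 128)
    factors = [t / s for t, s in zip(target, volume.shape)]
    # order=1: trilinear interpolation for image intensities
    vol_128 = zoom(volume.astype(np.float32), factors, order=1)
    # Illustrative min-max normalization (assumption, not the repo's exact scheme)
    vol_128 = (vol_128 - vol_128.min()) / (vol_128.max() - vol_128.min() + 1e-8)
    if seg is None:
        return vol_128
    # order=0: nearest-neighbour so label values stay integral
    seg_128 = zoom(seg, factors, order=0)
    return vol_128, seg_128

vol = np.random.rand(160, 192, 224)
seg = np.random.randint(0, 4, size=(160, 192, 224))
vol_128, seg_128 = resample_to_128(vol, seg)
print(vol_128.shape, seg_128.shape)  # (128, 128, 128) (128, 128, 128)
```

Note that nearest-neighbour resampling of the segmentation avoids the fractional label values (and hence poor Dice scores) that linear interpolation would introduce.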
Thank you very much for your reply. The information about resampling is very helpful, and thanks also for the hints on data preprocessing; my problem has been solved.
Thanks for sharing your work in this field.
I ran the provided code on MRI data of size (160, 192, 224), after carefully modifying the corresponding image_size. It fails with:
"Assign requires shapes of both tensors to match. lhs shape= [3,3,4,512,9] rhs shape= [2,2,2,512,9] [[{{node save/Assign_20}} = Assign[T=DT_FLOAT, _class=["loc:@gaffdfrm/affine_stem/conv7_W/W"]"
I wonder whether the data size has to remain (128, 128, 128) to run the code. If so, could you provide some suggestions on how to properly resample my data (volume and segmentation map) to (128, 128, 128)?
P.S.: I have tried resampling with the scipy.ndimage.zoom() function, but it results in poor Dice scores.
Thanks a lot.