Hello, how did you solve this problem? Thank you.
I solved the problem by applying RandomCrop so that all cases have the same size, e.g., [96, 96, 96]. I hope that helps you.
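For later readers, here is a minimal sketch of what such a fixed-size 3D crop can look like. The class name `RandomCrop3D` and the zero-padding behaviour are my own illustration, not necessarily identical to the transform used in this repository:

```python
import numpy as np

class RandomCrop3D:
    """Randomly crop a 3D volume (and its label) to a fixed output size.

    Pads symmetrically first if the volume is smaller than the target size,
    so every case ends up with the same shape, e.g. [96, 96, 96].
    """

    def __init__(self, output_size=(96, 96, 96)):
        self.output_size = output_size

    def __call__(self, image, label):
        # Pad each axis if needed so the crop always fits.
        pad = [max(self.output_size[i] - image.shape[i], 0) for i in range(3)]
        if any(pad):
            pad_width = [(p // 2, p - p // 2) for p in pad]
            image = np.pad(image, pad_width, mode="constant", constant_values=0)
            label = np.pad(label, pad_width, mode="constant", constant_values=0)

        d, h, w = image.shape
        od, oh, ow = self.output_size
        # Pick a random corner and cut the same region from image and label.
        z = np.random.randint(0, d - od + 1)
        y = np.random.randint(0, h - oh + 1)
        x = np.random.randint(0, w - ow + 1)
        image = image[z:z + od, y:y + oh, x:x + ow]
        label = label[z:z + od, y:y + oh, x:x + ow]
        return image, label
```

Because every cropped case has the same shape, the default DataLoader collate can stack them into a batch without errors.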
Thank you, I found the problem and solved it. I appreciate your response.
Hi Xiangde, recently I have been trying to use 'train_uncertainty_rectified_pyramid_consistency_3D.py' for segmentation on my own data (heart CT). I converted my data to '.h5', as you did. I thought RandomCrop might not be suitable for my data, so I removed it. The code then raised the error 'RuntimeError: stack expects each tensor to be equal size, but got [1, 191, 512, 512] at entry 0 and [1, 181, 512, 512] at entry 1'.

I searched the Internet for answers and found that others resize the data during preprocessing. However, for heart CT the dimensions differ between cases: person A is 191×512×512, person B is 181×512×512, and person C is 200×512×512. I do not know how to resize the data or otherwise solve this problem. Could you give me some advice? Thanks so much.
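In case it helps, below is a rough sketch (not from this repository) of one way to bring every case to a common shape before training: resampling with scipy.ndimage.zoom and rewriting the .h5 file. The 'image'/'label' keys and the target shape of (160, 512, 512) are assumptions for illustration only; keeping RandomCrop (with padding) as suggested above, or training with batch_size=1, are alternatives.

```python
import h5py
import numpy as np
from scipy import ndimage

def resample_to_fixed_shape(image, label, target_shape=(160, 512, 512)):
    """Resample a CT volume and its label map to one common shape.

    target_shape is only an example; the right choice depends on GPU
    memory and on the voxel spacing of the original scans.
    """
    factors = [t / s for t, s in zip(target_shape, image.shape)]
    # Linear interpolation for the image, nearest-neighbour for the label
    # so that label values stay discrete.
    image = ndimage.zoom(image, factors, order=1)
    label = ndimage.zoom(label, factors, order=0)
    return image, label

# Example: rewrite one existing .h5 case with the resampled arrays
# (the "image" and "label" dataset names are assumptions about the file layout).
with h5py.File("case_A.h5", "r") as f:
    image, label = f["image"][:], f["label"][:]
image, label = resample_to_fixed_shape(image, label)
with h5py.File("case_A_resampled.h5", "w") as f:
    f.create_dataset("image", data=image.astype(np.float32))
    f.create_dataset("label", data=label.astype(np.uint8))
```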