Zyun-Y / DconnNet

Codes for CVPR2023 paper "Directional Connectivity-based Segmentation of Medical Images"

Resize Error #28

Closed d1488j closed 4 months ago

d1488j commented 4 months ago

Hello,

I want to use DconnNet on another dataset (ACDC, medical heart segmentation). For that I implemented a get_Dataset function for ACDC, similar to get_Dataset_CHASE. But for this dataset I want to change the resize hyperparameter in train.py to 250x250 instead of 960x960. Then an error occurs with mismatched dimensions in the forward pass of model/DconnNet.py. I printed the non-matching dimensions, but I don't know how to fix it.

```
Train batch number: 8
Test batch number: 10
Pretrain Model Have been loaded!
START TRAIN.

Dimensions of tensor fb5(r5): torch.Size([5, 256, 8, 8])
Dimensions of tensor c4:      torch.Size([5, 256, 7, 7])

Traceback (most recent call last):
  File "/misc/usrhomes/d1488/train.py", line 172, in <module>
    main(args)
  File "/misc/usrhomes/d1488/train.py", line 167, in main
    solver.train(model, train_loader, val_loader, exp_id + 1, num_epochs=args.epochs)  # call the solver's train method
  File "/misc/usrhomes/d1488/solver.py", line 152, in train
    output, aux_out = net(X)  # the model (net) is called with the input X to produce predictions (output) and auxiliary outputs (aux_out)
  File "/no_backups/d1488/.pyenv/versions/venv/lib/python3.9/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/misc/usrhomes/d1488/model/DconnNet.py", line 96, in forward
    d4 = self.relu(self.fb5(r5) + c4)  # 256
RuntimeError: The size of tensor a (8) must match the size of tensor b (7) at non-singleton dimension 3
```

Can you please help me with this, or describe what else I need to do so the model accepts input resized to 250x250? I'm grateful for any help.

Zyun-Y commented 4 months ago

Hi,

This is because 250 is not divisible by 32. Think of the encoder as a series of downsampling stages: it downsamples the input 5 times, halving the spatial size each time, so the input size must be divisible by 2^5 = 32. Here are the possible solutions:
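The shape mismatch in the traceback (8 vs. 7) can be reproduced with a little size bookkeeping. This is a rough sketch, not the repo's code: whether a layer rounds up or down depends on its padding, and with a size not divisible by 32 the two rounding modes disagree after 5 halvings, so skip connections can no longer be added elementwise.

```python
def sizes(size, steps=5, mode="ceil"):
    """Track spatial size through `steps` stride-2 downsamplings."""
    out = [size]
    for _ in range(steps):
        size = (size + 1) // 2 if mode == "ceil" else size // 2
        out.append(size)
    return out

print(sizes(250, mode="ceil"))   # [250, 125, 63, 32, 16, 8]  -> the 8x8 branch
print(sizes(250, mode="floor"))  # [250, 125, 62, 31, 15, 7]  -> the 7x7 branch
print(sizes(256, mode="ceil"))   # [256, 128, 64, 32, 16, 8]  -> both branches agree
```

With 256 the two rounding modes produce identical sizes at every stage, which is why a 32-divisible input avoids the error.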

1) Change your input size to a shape that is divisible by 32, e.g., 256, and resize it back to 250 after the prediction or at the end of the model if needed.
2) Pad c4 from 7x7 to 8x8 before `d4=self.relu(self.fb5(r5)+c4) #256`, and do the same in all following parts.

I would recommend the first one.
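A minimal sketch of option 1 (assumed wrapper, not the repo's API; it treats the model as returning a single tensor, whereas DconnNet returns `output` and `aux_out`, so each would be resized the same way):

```python
import torch
import torch.nn.functional as F

def predict_resized(model, x, train_size=256, orig_size=250):
    """Resize input up to a 32-divisible size, run the model,
    then resize the prediction back to the original size."""
    x = F.interpolate(x, size=(train_size, train_size),
                      mode="bilinear", align_corners=False)
    out = model(x)
    return F.interpolate(out, size=(orig_size, orig_size),
                         mode="bilinear", align_corners=False)
```

Option 2 would instead pad the smaller skip tensor, e.g. `c4 = F.pad(c4, (0, 1, 0, 1))` to go from 7x7 to 8x8, but that has to be repeated at every stage where the shapes disagree, which is why option 1 is simpler.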

d1488j commented 4 months ago

Thanks a lot, now it works :)

Zyun-Y commented 4 months ago

Sounds good