MIC-DKFZ / nnUNet


About the model FLOPs and Parameters #2495

Open OCEANOUXIN opened 2 months ago

OCEANOUXIN commented 2 months ago

Hi Fabian, thanks for your excellent work! But I have some confusion. I wrote a UNet.py following your code (the same configuration); I don't think I made a mistake, but I get the following: [screenshots: FLOPs and parameter counts reported for the two models] I don't know why your model's FLOPs and parameter count are higher than mine.

seziegler commented 2 months ago

Hi @OCEANOUXIN ,

this can have many reasons, but in general the U-Net topology in nnU-Net is also adapted to the dataset at hand. This means that, depending on the dataset, you will get different U-Nets with different numbers of parameters etc.
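(For comparing the two networks, a minimal sketch of counting trainable parameters; the `Conv3d` at the end is just a stand-in, substitute your UNet and the nnU-Net network:)

```python
import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    # Total number of trainable parameters in the model.
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Stand-in module; replace with your own UNet and the nnU-Net network.
print(count_parameters(nn.Conv3d(32, 32, 3)))  # 27680
```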

OCEANOUXIN commented 2 months ago


Thanks for your early reply, @seziegler. But I used the same input for both of the two models, as follows: [screenshots: the same input shape used for both models] There may be a problem with the profile module. I don't know whether nnU-Net has other unused modules that profile also counts, or whether we should only count the modules the input actually passes through.
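(For reference, this is roughly how thop's `profile` is used; the stand-in model and input shape below are assumptions, not the actual setup in the screenshots:)

```python
import torch
import torch.nn as nn
from thop import profile  # pip install thop

# Stand-in model -- replace with your UNet or the nnU-Net network.
model = nn.Conv3d(1, 8, 3, padding=1)
x = torch.randn(1, 1, 64, 64, 64)  # adjust to your patch size

# Caveat: thop only counts ops for layer types it has hooks for; modules
# it does not recognize contribute zero FLOPs unless you register them
# via the custom_ops argument, which can distort comparisons between two
# differently structured implementations of the same architecture.
macs, params = profile(model, inputs=(x,))
print(f"MACs: {macs:,.0f}  params: {params:,.0f}")
```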

seziegler commented 2 months ago

You can check the exact architecture of the two models to see where they differ by using print(model)
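(One way to spot the differences quickly is to diff the two printed architectures; the two `nn.Sequential` stand-ins below are hypothetical:)

```python
import difflib
import torch.nn as nn

# Hypothetical stand-ins -- use your own UNet and the nnU-Net network.
model_a = nn.Sequential(nn.Conv3d(1, 8, 3), nn.ReLU())
model_b = nn.Sequential(nn.Conv3d(1, 8, 3), nn.LeakyReLU())

# Diff the two printed architectures line by line to see where they differ.
diff = difflib.unified_diff(repr(model_a).splitlines(),
                            repr(model_b).splitlines(), lineterm="")
print("\n".join(diff))
```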

OCEANOUXIN commented 2 months ago

> You can check the exact architecture of the two models to see where they differ by using print(model)

Yes, I actually did that. Then I found this in nnU-Net: [screenshot: print(model) output listing the same submodules twice] I don't know why the two parts are the same. How do they work? Are both of them used? Thanks for your reply.

seziegler commented 2 months ago

Hi, this is just the representation of the model, but the red boxes in your screenshot belong to only one layer. So in total there are two layers visible in the screenshot.

OCEANOUXIN commented 2 months ago

> Hi, this is just the representation of the model, but the red boxes in your screenshot belong to only one layer. So in total there are two layers visible in the screenshot.

Yes, I see that there are two layers, but what is the all_modules attribute? Does the model use that module? Could we get rid of it?

seziegler commented 2 months ago

No, all_modules is just a summary of the above; it is not counted as new parameters, so there is no need to get rid of it
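(A rough sketch of the pattern, showing why nothing is double-counted; the class below only mimics nnU-Net's building blocks and is not the actual implementation:)

```python
import torch.nn as nn

class ConvNormReLU(nn.Module):
    # Rough sketch of the pattern in nnU-Net's building blocks: the same
    # submodules appear both as attributes and inside an all_modules
    # nn.Sequential, so print(model) lists them twice.
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv3d(32, 32, 3, padding=1)
        self.norm = nn.InstanceNorm3d(32, affine=True)
        self.nonlin = nn.LeakyReLU(inplace=True)
        self.all_modules = nn.Sequential(self.conv, self.norm, self.nonlin)

    def forward(self, x):
        return self.all_modules(x)

block = ConvNormReLU()
# Module.parameters() deduplicates shared parameters, so nothing is
# double-counted despite the duplicated listing:
print(sum(p.numel() for p in block.parameters()))  # 27744, not 2 * 27744
```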

OCEANOUXIN commented 2 months ago

> No, all_modules is just a summary of the above; it is not counted as new parameters, so there is no need to get rid of it

Hi, sorry to disturb you again, but I have another question. About deep supervision: I'm wondering whether you downsample the labels when using deep supervision, and doesn't that lose information about the labels?
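(A minimal sketch of the nearest-neighbor downsampling commonly used to build deep-supervision targets; the shapes and class count here are assumptions:)

```python
import torch
import torch.nn.functional as F

# Integer label map, shape (batch, 1, D, H, W); interpolate needs float input.
seg = torch.randint(0, 3, (1, 1, 64, 64, 64)).float()

# Nearest-neighbor downsampling keeps every voxel a valid class id (no
# label mixing), but structures thinner than the coarse voxel spacing can
# vanish at the lower scales -- that is the information loss in question.
targets = [F.interpolate(seg, scale_factor=s, mode="nearest").long()
           for s in (1.0, 0.5, 0.25)]
print([tuple(t.shape) for t in targets])  # 64^3, 32^3, 16^3 targets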

OCEANOUXIN commented 2 months ago

> No, all_modules is just a summary of the above; it is not counted as new parameters, so there is no need to get rid of it

Also, I want to know: is it feasible to use deep supervision during the testing phase? Because I want to see whether my data is top-sensitive or bottom-sensitive.
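(Inspecting the per-scale outputs at eval time is possible in principle; a toy sketch, where `ToyDSNet` is hypothetical and not nnU-Net's API:)

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyDSNet(nn.Module):
    # Hypothetical stand-in for a deep-supervision network: returns one
    # logit map per decoder scale, highest resolution first.
    def __init__(self, num_classes=3):
        super().__init__()
        self.heads = nn.ModuleList(nn.Conv3d(1, num_classes, 1) for _ in range(3))

    def forward(self, x):
        return [head(F.avg_pool3d(x, 2 ** i) if i > 0 else x)
                for i, head in enumerate(self.heads)]

net = ToyDSNet().eval()
with torch.no_grad():
    outs = net(torch.randn(1, 1, 32, 32, 32))
# Evaluate each scale's output separately to see which resolutions the
# data is most sensitive to.
for i, o in enumerate(outs):
    print(f"scale {i}: {tuple(o.shape)}")
```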