Open CarbonPool opened 3 years ago
Torch models carry some temporary buffers. Most of them are removed by clearState()
when saving the model, but some remain. These buffers do not affect the scaling/denoising results.
The default pretrained models have been rebuilt with tools/cudnn2cunn.lua. During that step all unnecessary buffers are removed, so the files are small.
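The size difference comes down to whether cached intermediate buffers are serialized alongside the learned parameters. The actual project does this in Lua/Torch7 via clearState(); the following is only a minimal pure-Python sketch of the same idea, with a hypothetical `Layer` class standing in for a network module:

```python
import io
import pickle

class Layer:
    """Toy module: learned weights plus a cached activation buffer
    (hypothetical stand-in for Torch's temporary output/gradInput buffers)."""
    def __init__(self):
        self.weights = [0.0] * 1_000       # parameters that must be saved
        self.output = [0.0] * 100_000      # temporary buffer from the last forward pass

    def clear_state(self):
        # Analogue of Torch's clearState(): drop temporary buffers before saving.
        self.output = []
        return self

def serialized_size(obj):
    """Return the size in bytes of the pickled object."""
    buf = io.BytesIO()
    pickle.dump(buf_obj := obj, buf)
    return buf.tell()

layer = Layer()
before = serialized_size(layer)                 # buffers included: large file
after = serialized_size(layer.clear_state())    # buffers dropped: small file
print(before, after)
```

The buffers dominate the serialized size even though they carry no information needed at inference time, which is why a freshly trained checkpoint can be an order of magnitude larger than the distributed pretrained model.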
I noticed that my largest image size is 4096x2621. When training completes, my model file is several hundred MB, but the default pretrained cunet model file is only about 20 MB. I don't really understand these internals; I just guessed that a larger model file would help the enlarged image retain more detail, at the cost of longer training time.