torch / nn

Supporting modules to take modules rather than fixed parameters as input arguments? #798

Open zizhaozhang opened 8 years ago

zizhaozhang commented 8 years ago

I am wondering whether it would be good for modules to support taking other modules as constructor arguments, so that they can adjust their parameters dynamically.

For example, Fully Convolutional Networks (or Faster R-CNN) can take inputs of arbitrary size, and they contain "crop" (or padding) layers whose parameters depend on the size of the input test image. These layers need to adjust the crop size dynamically, because the input size is unknown during network construction. I think this is a very important capability for enabling end-to-end training. In Torch, if you add an nn.Identity() module at the beginning of the network and let the crop layer hold a reference to that module at construction time, then at test time the crop layer can infer the size dynamically from the module's output. I implemented such a layer myself, and it is very simple: I just allow the constructor to take a module, and the layer reads the size of that module's output variable to decide the crop size.
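A minimal sketch of the idea described above (the name nn.DynamicCrop and the crop-the-last-two-dimensions behaviour are my assumptions, not the original implementation):

```lua
require 'nn'

local DynamicCrop, parent = torch.class('nn.DynamicCrop', 'nn.Module')

-- The constructor takes a module instead of fixed sizes; the module's
-- output size is unknown here and is read lazily at forward time.
function DynamicCrop:__init(refModule)
   parent.__init(self)
   self.refModule = refModule
end

function DynamicCrop:updateOutput(input)
   local ref = self.refModule.output
   if ref:nDimension() > 0 then
      -- crop the last two (spatial) dimensions of input to match ref
      local h = ref:size(ref:dim() - 1)
      local w = ref:size(ref:dim())
      self.output = input:narrow(input:dim() - 1, 1, h)
                         :narrow(input:dim(), 1, w)
   else
      -- no forward pass has reached refModule yet; pass through unchanged
      self.output = input
   end
   return self.output
end

function DynamicCrop:updateGradInput(input, gradOutput)
   -- gradients outside the cropped region are zero
   self.gradInput:resizeAs(input):zero()
   local ref = self.refModule.output
   if ref:nDimension() > 0 then
      local h = ref:size(ref:dim() - 1)
      local w = ref:size(ref:dim())
      self.gradInput:narrow(input:dim() - 1, 1, h)
                    :narrow(input:dim(), 1, w):copy(gradOutput)
   else
      self.gradInput:copy(gradOutput)
   end
   return self.gradInput
end
```

At test time, forwarding the reference branch first (e.g. the nn.Identity() at the head of the network) makes the correct size available before the crop layer runs.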

But I am not sure whether this is a good direction for torch?

JiayunLi commented 7 years ago

Hi, I encountered a similar problem, and I found your solution very cool. Were you able to get the output sizes of a module during network construction? I can only get them after forwarding inputs. Did I miss anything?

zizhaozhang commented 7 years ago

When writing your module, you need a condition in updateOutput that checks whether an input has been received during forwarding; if so, you compute the sizes there. The size computation has to live in updateOutput so that nn does not try to evaluate it during initialization.
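The reason the check has to live in updateOutput: before any forward pass, a module's output is an empty tensor, so there is no size to read at construction time. A quick illustration:

```lua
require 'nn'

local m = nn.Identity()
-- before any forward pass the output tensor is empty
print(m.output:nDimension())   -- 0: no size information yet

m:forward(torch.Tensor(4, 3, 32, 32))
-- after forwarding, sizes can be read from m.output
print(m.output:size(4))        -- 32
```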

JiayunLi commented 7 years ago

I see. It works now. Thanks!