Seems that there is a similar implementation in dpnn, the PrintSize layer :)
You can also add more outputs. Like:
require 'nngraph'

input = nn.Identity()()
L1 = nn.Tanh()(nn.Linear(10, 20)(input))                         -- 10 -> 20
L2 = nn.Tanh()(nn.Linear(30, 60)(nn.JoinTable(1)({input, L1})))  -- 10 + 20 -> 60
L3 = nn.Tanh()(nn.Linear(80, 160)(nn.JoinTable(1)({L1, L2})))    -- 20 + 60 -> 160
g = nn.gModule({input}, {L1, L2, L3})
@JoostvDoorn Thanks! But in that way we need to provide some zero tensors as gradOutput for the unused outputs during backprop
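Something like this (a rough sketch, with sizes matching the snippet above): if you only care about the gradient coming from L3, you still have to hand zero tensors to the other two outputs:

local x = torch.rand(10)
local out = g:forward(x)   -- out is a table {L1, L2, L3}

-- gradOutput must be a table with one entry per graph output,
-- so the outputs we don't care about get zero tensors of the right size
local gradOut = {
  torch.zeros(20),   -- placeholder for L1
  torch.zeros(60),   -- placeholder for L2
  torch.rand(160),   -- actual gradient w.r.t. L3
}
g:backward(x, gradOut)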
Is there any neat way to debug nngraph? Thanks!
Currently I write a Debug layer which is almost the same as the Identity layer, except that I can print whatever I want inside updateOutput() and updateGradInput()
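In case it helps others, a minimal sketch of such a Debug layer (the module name and the print statements are just illustrative; it passes data through unchanged, like nn.Identity, assuming tensor inputs):

local Debug, parent = torch.class('nn.Debug', 'nn.Module')

function Debug:__init(label)
  parent.__init(self)
  self.label = label or 'Debug'
end

function Debug:updateOutput(input)
  -- print whatever you want to inspect on the forward pass
  print(self.label .. ' forward:', input:size())
  self.output = input
  return self.output
end

function Debug:updateGradInput(input, gradOutput)
  -- print whatever you want to inspect on the backward pass
  print(self.label .. ' backward:', gradOutput:size())
  self.gradInput = gradOutput
  return self.gradInput
end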