torch / nngraph

Graph Computation for nn

Any neat way to debug? #110

Closed blackyang closed 8 years ago

blackyang commented 8 years ago

Is there any neat way to debug nngraph? Thanks!

Currently I write a Debug layer that is almost the same as the Identity layer, except that I can print whatever I want inside updateOutput() and updateGradInput()
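Such a layer is straightforward to sketch. A minimal version (illustrative only, not the author's exact code; the class name and label argument are assumptions) could look like:

```lua
require 'nn'

-- A pass-through layer like nn.Identity that prints tensor sizes
-- as data flows through it in both directions.
local Debug, parent = torch.class('nn.Debug', 'nn.Module')

function Debug:__init(label)
   parent.__init(self)
   self.label = label or 'Debug'
end

function Debug:updateOutput(input)
   print(self.label .. ' forward:', input:size())
   self.output = input
   return self.output
end

function Debug:updateGradInput(input, gradOutput)
   print(self.label .. ' backward:', gradOutput:size())
   self.gradInput = gradOutput
   return self.gradInput
end
```

Because it forwards input and gradOutput unchanged, it can be spliced between any two nodes of an nngraph graph without affecting the computation.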

blackyang commented 8 years ago

Seems that there is a similar implementation in dpnn, the PrintSize layer :)

JoostvDoorn commented 8 years ago

You can also add more outputs. Like:

require 'nngraph'

input = nn.Identity()()
L1 = nn.Tanh()(nn.Linear(10, 20)(input))
L2 = nn.Tanh()(nn.Linear(30, 60)(nn.JoinTable(1)({input, L1})))
L3 = nn.Tanh()(nn.Linear(80, 160)(nn.JoinTable(1)({L1, L2})))

g = nn.gModule({input}, {L1,L2,L3})
blackyang commented 8 years ago

@JoostvDoorn Thanks! But that way we need to provide some zero tensors as gradOutput for the extra outputs during backprop
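Concretely, with the graph above (a sketch; sizes assume a 10-dimensional input vector, and gradL3 is a stand-in for the real loss gradient):

```lua
-- Backprop through the three-output gModule g defined above.
x = torch.randn(10)
out = g:forward(x)   -- out is {L1 output, L2 output, L3 output}

-- Suppose only L3 feeds the loss; the extra debugging outputs
-- L1 and L2 still need gradOutput tensors, so we pass zeros.
gradL3 = torch.randn(160)  -- stand-in for dloss/dL3
g:backward(x, {torch.zeros(20), torch.zeros(60), gradL3})
```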

blackyang commented 7 years ago

FYI, PyTorch introduces similar functionality: register_forward_hook and register_backward_hook
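For reference, a forward hook attaches to a module and is called with (module, input, output) after each forward pass, so sizes can be inspected without writing a custom layer (the model and layer sizes below are arbitrary, for illustration only):

```python
import torch
import torch.nn as nn

# Small model purely for illustration.
model = nn.Sequential(nn.Linear(10, 20), nn.Tanh())

captured = []  # shapes observed by the hook

# A forward hook receives (module, input, output) after forward runs.
def shape_hook(module, inputs, output):
    captured.append((module.__class__.__name__, tuple(output.shape)))

handle = model[0].register_forward_hook(shape_hook)

x = torch.randn(4, 10)
y = model(x)      # the hook fires during this call
handle.remove()   # detach the hook once done debugging
```

register_backward_hook works analogously for gradients flowing backward through the module.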