torch / nngraph

Graph Computation for nn

Questions about reading pretrain model weights in nngraph #141

Open xieshuqin opened 7 years ago

xieshuqin commented 7 years ago

hi there

I am trying to use a pretrained model to initialize my own model's weights. Both are nngraph models and the network structures are almost the same, so I expect that after the initialization they should produce the same outputs. After copying the weights, I forward both networks with the same random input, but the outputs are different. I am sure that I have correctly copied the weights from one to the other; I verified this by calling "torch.sum(torch.ne(param1, param2))", where param1 and param2 are the parameters of the two networks. The outputs diverge after they pass through nn.SpatialBatchNormalization, even though the params in nn.SpatialBatchNormalization are the same. Is there any way to fix this? Or is there a better way to directly modify the structure of an nngraph model?
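A likely cause is that nn.SpatialBatchNormalization keeps running statistics (running_mean and running_var, or running_std in older torch versions) as buffers, which getParameters() does not return, so comparing or copying parameters alone misses them; batch norm also behaves differently in training versus evaluation mode. The following is a minimal sketch of how these could be copied, assuming two already-loaded nngraph models held in hypothetical variables `pretrained` and `model` whose modules appear in the same order:

```lua
require 'nn'
require 'nngraph'

-- Copy the learnable weights (getParameters only covers weight/bias).
local params = model:getParameters()
local srcParams = pretrained:getParameters()
params:copy(srcParams)

-- Copy the BatchNorm running statistics separately, since they are
-- buffers rather than parameters. This assumes both graphs enumerate
-- their nn.SpatialBatchNormalization modules in the same order.
local srcBN = pretrained:findModules('nn.SpatialBatchNormalization')
local dstBN = model:findModules('nn.SpatialBatchNormalization')
for i = 1, #srcBN do
  dstBN[i].running_mean:copy(srcBN[i].running_mean)
  -- older torch builds use running_std instead of running_var
  if srcBN[i].running_var then
    dstBN[i].running_var:copy(srcBN[i].running_var)
  end
end

-- Put both networks in evaluation mode so batch norm uses the
-- running statistics rather than per-batch statistics.
model:evaluate()
pretrained:evaluate()
```

With both models in evaluate() mode and the running statistics copied, forwarding the same input through each should give matching outputs.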

arthitag commented 7 years ago

Hi @xieshuqin , I am facing the exact same issue. Did you figure out why the BatchNorm layer acts so weird? Please share. Thanks!