e-lab / torch-toolbox

A collection of snippets and libraries for Torch from e-Lab
https://engineering.purdue.edu/elab/

Do I need re-initialization after cudnn.convert? #22

Closed · byronwwang closed this issue 7 years ago

byronwwang commented 7 years ago

I have a question. When I build a network, I use the methods in Weight-init.lua to initialize the weights of my network, which consists entirely of 'nn' layers. But sometimes I need to convert the net to its cudnn version. Do I need to re-initialize the weights after cudnn.convert(model, cudnn)? Is there an easy way?

jhjin commented 7 years ago

I do not think re-initialization is needed after the conversion. cudnn.convert swaps module metatables and therefore should not affect the actual parameters.
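For reference, here is a minimal sketch (not part of the original thread) that checks this. It assumes cunn and cudnn are installed, that weight-init.lua from this repository is on the Lua path and used as in its README, and it compares the flattened parameters before and after the conversion.

```lua
require 'nn'
require 'cunn'
require 'cudnn'

-- Build an nn-only model and initialize it with weight-init.lua
-- ('xavier' is one of the methods it provides).
local model = nn.Sequential()
model:add(nn.SpatialConvolution(3, 16, 3, 3))
model:add(nn.ReLU())
model = require('weight-init')(model, 'xavier')

-- Snapshot the initialized parameters (as float, since cudnn works in float).
local before = model:float():getParameters():clone()

-- Move to the GPU and swap nn modules for their cudnn counterparts in place.
model:cuda()
cudnn.convert(model, cudnn)

-- The conversion only swaps metatables, so the parameter values are untouched
-- and no re-initialization is needed; this difference should print 0.
local after = model:getParameters():float()
print((before - after):abs():max())
```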

byronwwang commented 7 years ago

@jhjin Thanks for your reply. That seems to be the case.