emited / VariationalRecurrentNeuralNetwork

PyTorch implementation of the Variational Recurrent Neural Network (VRNN).

Repackaging of states necessary? #1

Closed maximilianigl closed 7 years ago

maximilianigl commented 7 years ago

Hi,

in the VRNN class (in model.py), you cut the gradients at h_{t-1} using the _repackage_state() function. I've been thinking about this for a while and would have said that the correct thing to do is to not cut the gradients, since nothing in the paper indicates that one should.
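To make the question concrete, here is a hypothetical sketch (not the repo's actual code) of what a `_repackage_state`-style detach does: once the hidden state is detached, the loss at a later step can no longer backpropagate past that point.

```python
import torch

h = torch.zeros(1, 4, requires_grad=True)   # initial hidden state
w = torch.randn(4, 4, requires_grad=True)   # toy recurrence weights

# Two recurrence steps WITHOUT detaching: the loss at t=2
# backpropagates through h_1 all the way to the initial state.
h1 = torch.tanh(h @ w)
h2 = torch.tanh(h1 @ w)
h2.sum().backward()
assert h.grad is not None   # gradient reaches the initial state

# Same two steps, but detaching h_1 (as a repackage step would):
h.grad = None
h1 = torch.tanh(h @ w)
h1 = h1.detach()            # cuts the graph at h_{t-1}
h2 = torch.tanh(h1 @ w)
h2.sum().backward()
assert h.grad is None       # no gradient flows past the detach point
```

So with per-step repackaging, the VRNN's transition weights only ever see one-step gradients, which is the behavior in question.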

May I ask what your reasoning is? I'm not very sure about mine.

Thanks! Best, Max

emited commented 7 years ago

Indeed, you are correct. This was my first PyTorch project, and at the time I thought that this step was necessary for any model with recurrent states. Thanks for that.