jihunchoi / recurrent-batch-normalization-pytorch

PyTorch implementation of recurrent batch normalization

Bi-directional support; weight in line 281 is not defined #12

Open · h-jia opened 5 years ago

h-jia commented 5 years ago

Thanks very much for releasing the code! Great job!

However, I ran into two questions while going through the code: 1. How can a bi-directional BatchNorm LSTM be realised? 2. The `weight` variable used in lines 281 and 282 of bnlstm.py is not defined. I have tried defining it in several ways, but all of them failed. Any suggestions? Specifically, the code in question is:

```python
if hx is None:
    hx = (Variable(nn.init.xavier_uniform(
              weight.new(self.num_layers, batch_size, self.hidden_size))),
          Variable(nn.init.xavier_uniform(
              weight.new(self.num_layers, batch_size, self.hidden_size))))
```
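For concreteness, here is a rough sketch of the kind of fix I have in mind. It assumes `weight` was simply meant to be some parameter tensor of the module (taken here from `next(self.parameters())` so the new tensors inherit its device and dtype), and that `self.num_layers` and `self.hidden_size` are the module attributes used elsewhere in bnlstm.py; this is not the repository author's code.

```python
import torch
import torch.nn as nn

# Sketch of a possible replacement for the snippet above (an assumption,
# not the author's code): borrow any existing parameter to get the right
# device/dtype, then build Xavier-initialised hidden and cell states.
# Variable() is no longer needed in PyTorch 1.0, so plain tensors suffice.
if hx is None:
    weight = next(self.parameters()).data  # hypothetical stand-in for the undefined `weight`
    h0 = nn.init.xavier_uniform_(
        weight.new_zeros(self.num_layers, batch_size, self.hidden_size))
    c0 = nn.init.xavier_uniform_(
        weight.new_zeros(self.num_layers, batch_size, self.hidden_size))
    hx = (h0, c0)
```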

songtaoshi commented 5 years ago

@h-jia Hello, I have run into the same problem. Have you solved it?

h-jia commented 5 years ago

@songtaoshi Here is my implementation, made compatible with PyTorch 1.0.