-
See the post at http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/
So here are the details we should add:
- Add more of an RNN introduction (a minimal sketch follows below)
- Backpropagation throug…
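Since the linked WildML post walks through exactly this material, a minimal numpy sketch of a vanilla RNN forward pass plus backpropagation through time might be worth keeping next to the prose. Everything below (the parameter names Wxh/Whh/Why, the softmax/cross-entropy loss, the shapes) is an illustrative assumption rather than code taken from the post:

```python
import numpy as np

def rnn_step(x, h_prev, Wxh, Whh, bh):
    """One time step: h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + bh)."""
    return np.tanh(Wxh @ x + Whh @ h_prev + bh)

def forward_backward(xs, ys, h0, Wxh, Whh, Why, bh, by):
    """Run the forward pass over the whole sequence, then BPTT."""
    hs, ps, loss = {-1: h0}, {}, 0.0
    for t in range(len(xs)):                          # forward pass
        hs[t] = rnn_step(xs[t], hs[t - 1], Wxh, Whh, bh)
        logits = Why @ hs[t] + by
        ps[t] = np.exp(logits) / np.sum(np.exp(logits))   # softmax over classes
        loss += -np.log(ps[t][ys[t]])                 # cross-entropy at step t
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dbh, dby = np.zeros_like(bh), np.zeros_like(by)
    dh_next = np.zeros_like(h0)
    for t in reversed(range(len(xs))):                # backward pass (BPTT)
        dy = ps[t].copy(); dy[ys[t]] -= 1.0           # d loss / d logits
        dWhy += np.outer(dy, hs[t]); dby += dy
        dh = Why.T @ dy + dh_next                     # gradient arriving at h_t
        draw = (1.0 - hs[t] ** 2) * dh                # back through the tanh
        dbh += draw
        dWxh += np.outer(draw, xs[t])
        dWhh += np.outer(draw, hs[t - 1])
        dh_next = Whh.T @ draw                        # carried back to h_{t-1}
    return loss, (dWxh, dWhh, dWhy, dbh, dby)
```

In practice the gradients are usually clipped before the update step, since plain-RNN gradients can explode over long sequences.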
-
In the newly added LSTM examples, the LSTM implementation does not seem as modular as it could be. The API could be improved to [compose more easily with other symbols](https://github.com/dmlc/mxnet/blob/…
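For what it's worth, here is a plain-numpy sketch (not MXNet's symbol API; every name here is hypothetical) of the kind of cell-level interface that tends to compose well: the cell exposes a single `step` mapping (input, previous state) to (output, next state), so callers can wire it into whatever surrounding structure they like:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    def __init__(self, input_size, hidden_size, rng=np.random):
        # One fused weight matrix for the input, forget, output and cell gates.
        self.W = rng.randn(4 * hidden_size, input_size + hidden_size) * 0.1
        self.b = np.zeros(4 * hidden_size)
        self.hidden_size = hidden_size

    def step(self, x, state):
        """Map (input, previous state) -> (output, next state)."""
        h_prev, c_prev = state
        z = self.W @ np.concatenate([x, h_prev]) + self.b
        i, f, o, g = np.split(z, 4)
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        c = f * c_prev + i * g
        h = o * np.tanh(c)
        return h, (h, c)

    def zero_state(self):
        return np.zeros(self.hidden_size), np.zeros(self.hidden_size)
```

Because the unrolling loop and the output projection live outside the cell, stacking two cells or attaching a different decoder is just ordinary composition of `step` calls.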
-
Thank you for this awesome package. Much wow.
I cannot use _dp_, which all the examples are based on (https://github.com/Element-Research/rnn/issues/60).
So I have to make sure _optim_ is supported for ar…
-
Hi,
Thanks for the great work. I wonder if there is a way to make copies of a time step of an LSTM. I am trying to implement beam search with a normal LSTM. This requires dynamically branching a…
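In case it helps to pin down the idea, here is a generic numpy sketch of beam-search branching, not the rnn package's API. It assumes a hypothetical `cell.step(x, (h, c)) -> (logits, (h, c))` and an `embed` lookup; the "copy of a time step" is simply a copy of the (h, c) arrays attached to each hypothesis:

```python
import numpy as np

def beam_search_step(cell, beams, beam_width, embed):
    """One decoding step: expand every hypothesis, keep the best beam_width."""
    candidates = []
    for tokens, score, (h, c) in beams:
        logits, (h_new, c_new) = cell.step(embed(tokens[-1]), (h, c))
        logprobs = logits - np.log(np.sum(np.exp(logits)))   # log-softmax
        for tok in np.argsort(logprobs)[-beam_width:]:
            # Each branch gets its own copy of the recurrent state, so the
            # hypotheses can diverge independently on the next step.
            candidates.append((tokens + [int(tok)],
                               score + float(logprobs[tok]),
                               (h_new.copy(), c_new.copy())))
    candidates.sort(key=lambda cand: cand[1], reverse=True)
    return candidates[:beam_width]
```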
-
Hi guys,
I modified the recurrent-language-model example in rnn/examples/recurrent-language-model.lua to handle video data. In that example, the FastLSTM module is decorated with a Sequencer modul…
-
Hey guys,
I was wondering how the initial internal states in a recurrent layer are dealt with. So far it appears they are reset on every run. Is there any way to preserve them?
I'd like to be able …
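As a generic illustration of the distinction (plain numpy, not the rnn package's own API), the difference comes down to whether the hidden state lives across calls and is only zeroed when the caller explicitly asks for it:

```python
import numpy as np

class StatefulRNN:
    def __init__(self, input_size, hidden_size, rng=np.random):
        self.Wxh = rng.randn(hidden_size, input_size) * 0.1
        self.Whh = rng.randn(hidden_size, hidden_size) * 0.1
        self.bh = np.zeros(hidden_size)
        self.h = np.zeros(hidden_size)   # persistent hidden state

    def forward(self, xs):
        """Process a sequence, starting from whatever state the last call left."""
        outputs = []
        for x in xs:
            self.h = np.tanh(self.Wxh @ x + self.Whh @ self.h + self.bh)
            outputs.append(self.h.copy())
        return outputs

    def reset(self):
        """Explicitly zero the state, e.g. at sequence boundaries."""
        self.h = np.zeros_like(self.h)
```

Whether states are "reset every run" then just depends on whether the training loop calls `reset()` between runs or not.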
-
Peter,
Thank you so much for the great RNN tutorial post. This might seem long, but it is very quick.
1 - For Part 1, you defined the states array S to be 1x1. How will your example change if one d…
-
I wrote some GRU-RNN code and ran into a problem with scan.
The class takes the input size, embedding size, hidden size, etc. as input. The hidden size can be a list or just a number (list mean…
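Since the post is cut off, this is only a guess at the setup: a plain-numpy sketch of the GRU step itself, plus one way to treat the hidden size uniformly, assuming that a list of sizes is meant to describe a stack of GRU layers (an int is then just a one-element list). All parameter names here are my own:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh):
    """h_t = (1 - z) * h_{t-1} + z * h_tilde, with update gate z and reset gate r."""
    z = sigmoid(Wz @ x + Uz @ h + bz)              # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)  # candidate state
    return (1.0 - z) * h + z * h_tilde

def init_layer(in_size, n, rng):
    """Weights for one GRU layer: (Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh)."""
    def W(): return rng.randn(n, in_size) * 0.1
    def U(): return rng.randn(n, n) * 0.1
    def b(): return np.zeros(n)
    return (W(), U(), b(), W(), U(), b(), W(), U(), b())

def init_stack(input_size, hidden_size, rng=np.random):
    """Accept either a single int or a list of per-layer hidden sizes."""
    sizes = [hidden_size] if isinstance(hidden_size, int) else list(hidden_size)
    layers, prev = [], input_size
    for n in sizes:
        layers.append(init_layer(prev, n, rng))
        prev = n
    return layers
```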
-
Thanks for writing the library!
What's the best way to make multi-layer LSTMs without using the Sequencer? (I am finding Sequencer() to be slightly too memory-intensive for my dataset, so I want to d…
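As a rough sketch of the manual alternative (plain Python here, not the rnn package's Lua API): keep one state per layer and drive the stack with an explicit loop over time steps, assuming hypothetical cell objects that expose `step(x, state) -> (output, new_state)` and `zero_state()`:

```python
def run_stack(cells, xs):
    """Unroll a stack of recurrent cells over a sequence with an explicit loop."""
    states = [cell.zero_state() for cell in cells]
    outputs = []
    for x in xs:                            # manual loop over time steps
        inp = x
        for i, cell in enumerate(cells):    # feed each layer's output upward
            inp, states[i] = cell.step(inp, states[i])
        outputs.append(inp)                 # keep only the top layer's output
    return outputs, states
```

Since the loop is explicit, you decide per time step what gets retained, which is the usual reason to skip a sequence-level wrapper when memory is tight.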
-
I'm attempting to recreate this [model](https://github.com/bplank/bilstm-aux) in TensorFlow but I cannot seem to get anywhere close to the speed of the original code. On average the TensorFlow implem…