Closed vivanov879 closed 7 years ago
Hi, Can you clarify what you mean with "remembering states"? Do you mean that you would like to have access to the states of all the layers?
It seems there's no way to pass an initial state to multiPLSTM. How can I resume the previous RNN state?
Or can I use MultiRNNCell with PhasedLSTMCell directly?
Made a pull request.
@indiejoseph I'm not sure it's going to be possible to use MultiRNNCell, because we need to add the time input between layers. But you can now use multiPLSTM to resume the previous state.
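To illustrate the idea of resuming state between runs, here is a minimal NumPy sketch (not the repo's multiPLSTM API, and using a plain tanh cell rather than a PhasedLSTM): the final state of one call is fed back as the initial state of the next, which is equivalent to a single pass over the full sequence.

```python
import numpy as np

def rnn_step(state, x, W, U):
    # Simple tanh RNN cell; a PhasedLSTM would add a time gate on top of
    # this, but the state-resumption pattern is the same.
    return np.tanh(state @ W + x @ U)

def run_rnn(xs, init_state, W, U):
    # Run the cell over a chunk of inputs, returning the final state so the
    # caller can pass it back in on the next run.
    state = init_state
    for x in xs:
        state = rnn_step(state, x, W, U)
    return state

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)) * 0.1   # recurrent weights (toy sizes)
U = rng.normal(size=(3, 4)) * 0.1   # input weights
xs = rng.normal(size=(10, 3))       # a sequence of 10 input vectors
zero = np.zeros(4)

# One pass over the whole sequence...
full = run_rnn(xs, zero, W, U)

# ...equals two runs where the final state is carried over between them.
mid = run_rnn(xs[:5], zero, W, U)
resumed = run_rnn(xs[5:], mid, W, U)
assert np.allclose(full, resumed)
```

In the repo, the same pattern would mean fetching the returned state after one `session.run` and feeding it back as the initial state on the next.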
Thanks a lot -- all's working. Now running your code:

```
/usr/bin/python3.5 /home/vivanov/PycharmProjects/PLSTM/simplePhasedLSTM.py
Compiling RNN... DONE!
Compiling cost functions... DONE!
/usr/local/lib/python3.5/dist-packages/tensorflow/python/ops/gradients_impl.py:91: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.
  "Converting sparse IndexedSlices to a dense Tensor of unknown shape. "
Calculating gradients... DONE!
Initializing variables... DONE!
+-----------+----------+------------+
| Epoch=0   | Cost     | Accuracy   |
+===========+==========+============+
| Train     | 0.482815 | 0.785156   |
+-----------+----------+------------+
| Test      | 0.245212 | 0.9375     |
+-----------+----------+------------+
```
Hi, how can I remember the states of the LSTM between runs?