alexander34ro closed this issue 3 years ago
Theory: http://colah.github.io/posts/2015-08-Understanding-LSTMs/
Code examples: https://stackoverflow.com/questions/54767816/how-exactly-does-lstmcell-from-tensorflow-operates
The code example is a StackOverflow thread on implementing tf.compat.v1.nn.rnn_cell.LSTMCell.
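To make the gate equations from the Colah post concrete, here is a minimal NumPy sketch of a single LSTM cell step. It is an illustration only, not the TensorFlow implementation; the stacked weight layout (input, forget, candidate, output) and all variable names are assumptions for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step following the standard gate equations.

    x: (input_dim,), h_prev/c_prev: (hidden_dim,)
    W: (4*hidden_dim, input_dim), U: (4*hidden_dim, hidden_dim), b: (4*hidden_dim,)
    Assumed gate order in the stacked weights: input, forget, candidate, output.
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:n])          # input gate
    f = sigmoid(z[n:2*n])        # forget gate
    g = np.tanh(z[2*n:3*n])      # candidate cell state
    o = sigmoid(z[3*n:4*n])      # output gate
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c

rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4
W = rng.standard_normal((4 * hidden_dim, input_dim)) * 0.1
U = rng.standard_normal((4 * hidden_dim, hidden_dim)) * 0.1
b = np.zeros(4 * hidden_dim)
h, c = np.zeros(hidden_dim), np.zeros(hidden_dim)
h, c = lstm_cell_step(rng.standard_normal(input_dim), h, c, W, U, b)
print(h.shape, c.shape)  # (4,) (4,)
```

Note that the hidden state is bounded: `h = o * tanh(c)` keeps every component strictly inside (-1, 1), while the cell state `c` is unbounded and carries the long-term memory.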
For Bi-Directional RNNs we can use the following references:
Theory: https://d2l.ai/chapter_recurrent-modern/bi-rnn.html
Code examples: https://mxnet.apache.org/versions/1.6/api/python/docs/api/gluon/rnn/index.html#mxnet.gluon.rnn.LSTMCell
The code example details the equations used by mxnet.gluon.rnn.LSTMCell.
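The core idea of a bidirectional RNN is simple: run one recurrent cell forward over the sequence, run a second cell over the reversed sequence, and concatenate the two hidden states at each time step. A minimal NumPy sketch (using a plain tanh RNN step as a stand-in for an LSTM cell; all names here are illustrative, not from any library):

```python
import numpy as np

def run_rnn(inputs, step_fn, hidden_dim):
    """Run step_fn over time, collecting hidden states. inputs: (T, input_dim)."""
    h = np.zeros(hidden_dim)
    outputs = []
    for x in inputs:
        h = step_fn(x, h)
        outputs.append(h)
    return np.stack(outputs)  # (T, hidden_dim)

def bidirectional(inputs, fw_step, bw_step, hidden_dim):
    fw = run_rnn(inputs, fw_step, hidden_dim)
    # run backward over the reversed sequence, then flip back to forward order
    bw = run_rnn(inputs[::-1], bw_step, hidden_dim)[::-1]
    return np.concatenate([fw, bw], axis=-1)  # (T, 2*hidden_dim)

# simple tanh RNN step standing in for an LSTM cell
rng = np.random.default_rng(0)
input_dim, hidden_dim, T = 3, 4, 5
Wf = rng.standard_normal((hidden_dim, input_dim + hidden_dim)) * 0.1
Wb = rng.standard_normal((hidden_dim, input_dim + hidden_dim)) * 0.1
fw_step = lambda x, h: np.tanh(Wf @ np.concatenate([x, h]))
bw_step = lambda x, h: np.tanh(Wb @ np.concatenate([x, h]))

out = bidirectional(rng.standard_normal((T, input_dim)), fw_step, bw_step, hidden_dim)
print(out.shape)  # (5, 8)
```

This mirrors what TensorFlow's bidirectional wrapper does internally: the output at each step has size 2*hidden_dim because the forward and backward states are concatenated.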
We can use two tf.compat.v1.nn.rnn_cell.LSTMCell components, combined with tf.compat.v1.nn.bidirectional_dynamic_rnn, to replicate a Bi-Directional LSTM.
API reference: https://www.tensorflow.org/api_docs/python/tf/compat/v1/nn/bidirectional_dynamic_rnn
Guide: https://riptutorial.com/tensorflow/example/17004/creating-a-bidirectional-lstm
import tensorflow as tf

# Hyperparameters
lstm_units = 200
keep_prob = 0.5  # probability of keeping an input element after dropout

# Single forward and backward cells, each wrapped with input dropout
lstm_fw_cell = tf.compat.v1.nn.rnn_cell.LSTMCell(num_units=lstm_units)
lstm_bw_cell = tf.compat.v1.nn.rnn_cell.LSTMCell(num_units=lstm_units)
lstm_fw_with_dropout = tf.compat.v1.nn.rnn_cell.DropoutWrapper(lstm_fw_cell, input_keep_prob=keep_prob)
lstm_bw_with_dropout = tf.compat.v1.nn.rnn_cell.DropoutWrapper(lstm_bw_cell, input_keep_prob=keep_prob)

# bidirectional_dynamic_rnn expects a single inputs tensor of shape
# [batch_size, max_time, depth] (or [max_time, batch_size, depth]
# when called with time_major=True)
outputs, states = tf.compat.v1.nn.bidirectional_dynamic_rnn(
    lstm_fw_with_dropout, lstm_bw_with_dropout, X, dtype=tf.float64)