HadoopIt / rnn-nlu

A TensorFlow implementation of Recurrent Neural Networks for Sequence Classification and Sequence Labeling

Update with the recent changes in Tensorflow #7

Closed zuxfoucault closed 7 years ago

zuxfoucault commented 7 years ago

Due to this change in TensorFlow, and following its suggestion:

writing: MultiRNNCell([lstm] * 5) will now build a 5-layer LSTM stack where each layer shares the same parameters. To get 5 layers each with their own parameters, write: MultiRNNCell([LSTMCell(...) for _ in range(5)]).

Should cell = tf.contrib.rnn.MultiRNNCell([single_cell] * num_layers) in this line be updated to cell = tf.contrib.rnn.MultiRNNCell([single_cell for _ in range(num_layers)])? Thanks!
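
For reference, a minimal sketch of the pattern the release note above recommends, with placeholder hyperparameters (size, num_layers) rather than the repository's actual values; the key point is that the cell constructor is called once per layer inside the comprehension:

import tensorflow as tf

size, num_layers = 128, 2  # placeholder values, not the repo's settings

# Pre-1.0 style: the same cell object is repeated, so each layer would now
# share one set of parameters.
# cell = tf.contrib.rnn.MultiRNNCell([tf.contrib.rnn.BasicLSTMCell(size)] * num_layers)

# Recommended style: build a fresh cell per layer so every layer has its own parameters.
cell = tf.contrib.rnn.MultiRNNCell(
    [tf.contrib.rnn.BasicLSTMCell(size) for _ in range(num_layers)])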

zuxfoucault commented 7 years ago

Besides, since from tensorflow.python.ops import rnn in this line has moved to from tensorflow.contrib import rnn, I'm not sure what rnn.rnn() in this line corresponds to in the newer TensorFlow version. I suppose it should be rnn.static_rnn? Thanks!
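
As far as I can tell, rnn.static_rnn takes the same arguments as the old rnn.rnn. A minimal sketch of the rename, with placeholder shapes (batch, num_steps, input_dim, size) used only for illustration:

import tensorflow as tf
from tensorflow.contrib import rnn  # replaces: from tensorflow.python.ops import rnn

batch, num_steps, input_dim, size = 32, 10, 64, 128  # placeholder shapes

cell = rnn.GRUCell(size)
# Both the old rnn.rnn and the new rnn.static_rnn expect a Python list of
# per-timestep tensors of shape [batch, input_dim].
inputs = [tf.placeholder(tf.float32, [batch, input_dim]) for _ in range(num_steps)]

# Old call: outputs, state = rnn.rnn(cell, inputs, dtype=tf.float32)
outputs, state = rnn.static_rnn(cell, inputs, dtype=tf.float32)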

raphael-sch commented 7 years ago

You need to instantiate the cell multiple times. Something like this:

# Create the internal multi-layer cell for our RNN.
cell_type = tf.nn.rnn_cell.GRUCell
if use_lstm:
    cell_type = tf.nn.rnn_cell.BasicLSTMCell
cell = cell_type(size)
if num_layers > 1:
    cell = tf.nn.rnn_cell.MultiRNNCell([cell_type(size) for _ in range(num_layers)])
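
A small sketch of why instantiating inside the comprehension matters: repeating one already-constructed cell, whether via [single_cell] * num_layers or [single_cell for _ in range(num_layers)], puts the same Python object in the list several times, which is exactly the parameter-sharing case the release note warns about. The names below are only for illustration:

import tensorflow as tf

single_cell = tf.contrib.rnn.BasicLSTMCell(128)
cells_shared = [single_cell] * 3                                     # same object three times
cells_fresh = [tf.contrib.rnn.BasicLSTMCell(128) for _ in range(3)]  # three distinct cells

assert cells_shared[0] is cells_shared[1]    # one cell repeated, so shared parameters
assert cells_fresh[0] is not cells_fresh[1]  # separate cells, separate parameters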

HadoopIt commented 7 years ago

Thanks @raphael-sch and @zuxfoucault, the code is now updated to work with the latest TensorFlow API r1.2. Sorry for the delayed updates.