aymericdamien / TensorFlow-Examples

TensorFlow Tutorial and Examples for Beginners (support TF v1 & v2)

BasicLSTMCell variable scope #145

Open clu5 opened 7 years ago

clu5 commented 7 years ago
```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-7-9e18402660aa> in <module>()
----> 1 pred = RNN(x, weights, biases)
      2 
      3 # Define loss and optimizer
      4 cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=pred, labels=y))
      5 optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)

<ipython-input-6-2186d473735d> in RNN(x, weights, biases)
     12 
     13     # Get lstm cell output
---> 14     outputs, states = rnn.static_rnn(lstm_cell, x, dtype=tf.float32)
     15 
     16     # Linear activation, using rnn inner loop last output

/home/clu/anaconda3/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn.py in static_rnn(cell, inputs, initial_state, dtype, sequence_length, scope)
    195             state_size=cell.state_size)
    196       else:
--> 197         (output, state) = call_cell()
    198 
    199       outputs.append(output)

/home/clu/anaconda3/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn.py in <lambda>()
    182       if time > 0: varscope.reuse_variables()
    183       # pylint: disable=cell-var-from-loop
--> 184       call_cell = lambda: cell(input_, state)
    185       # pylint: enable=cell-var-from-loop
    186       if sequence_length is not None:

/home/clu/anaconda3/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn_cell_impl.py in __call__(self, inputs, state, scope)
    233   def __call__(self, inputs, state, scope=None):
    234     """Long short-term memory cell (LSTM)."""
--> 235     with _checked_scope(self, scope or "basic_lstm_cell", reuse=self._reuse):
    236       # Parameters of gates are concatenated into one multiply for efficiency.
    237       if self._state_is_tuple:

/home/clu/anaconda3/lib/python3.5/contextlib.py in __enter__(self)
     57     def __enter__(self):
     58         try:
---> 59             return next(self.gen)
     60         except StopIteration:
     61             raise RuntimeError("generator didn't yield") from None

/home/clu/anaconda3/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn_cell_impl.py in _checked_scope(cell, scope, reuse, **kwargs)
     91             "To share the weights of an RNNCell, simply "
     92             "reuse it in your second calculation, or create a new one with "
---> 93             "the argument reuse=True." % (scope_name, type(cell).__name__))
     94 
     95     # Everything is OK.  Update the cell's scope and yield it.

ValueError: Attempt to have a second RNNCell use the weights of a variable scope that already has weights: 'rnn/basic_lstm_cell'; and the cell was not constructed as BasicLSTMCell(..., reuse=True).  To share the weights of an RNNCell, simply reuse it in your second calculation, or create a new one with the argument reuse=True.
```

It looks like the new TF version breaks the RNN examples. Each LSTM cell should be updated to use its own variable scope, as done here.
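To see why the ValueError fires, here is a minimal pure-Python sketch (not TF code; the class and names are hypothetical stand-ins) of the reuse check that TF's variable scopes perform: creating a variable under a name that already exists raises unless sharing is explicitly requested, which is exactly what happens when a second cell (or a second run of the same cell) lands in the scope `rnn/basic_lstm_cell`.

```python
# Toy model of TF's variable-scope reuse check (illustration only).
class VariableScope:
    def __init__(self):
        self._vars = {}  # name -> variable

    def get_variable(self, name, reuse=False):
        if name in self._vars:
            if not reuse:
                # Mirrors the check in _checked_scope: a second creation
                # under an occupied scope is an error.
                raise ValueError(
                    "Variable %r already exists; pass reuse=True to share it."
                    % name)
            return self._vars[name]
        self._vars[name] = object()  # stand-in for a real variable tensor
        return self._vars[name]

scope = VariableScope()
w1 = scope.get_variable("rnn/basic_lstm_cell/weights")               # first creation: OK
w2 = scope.get_variable("rnn/basic_lstm_cell/weights", reuse=True)   # explicit sharing: OK
assert w1 is w2
# scope.get_variable("rnn/basic_lstm_cell/weights")  # without reuse: raises ValueError
```

Giving each cell its own scope corresponds to using a distinct `name` per cell, so no collision can occur.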

BrambleXu commented 7 years ago

I ran into this error too. Did you solve it using the solution from here? This might also be helpful, but I am not sure how to change the code according to those solutions.

JasonHanG commented 7 years ago

If you run the notebook cell that defines those variables twice, you will hit this error. On the second run you are literally doing what the message says: attempting to have a second RNNCell use the weights of a variable scope that already has weights. You cannot define those variables twice in the same graph without causing a conflict. The simplest solution: restart your kernel/notebook so the previously defined variables are cleaned up; in a fresh session everything works fine.