Open tuong-olli opened 3 years ago
Hi, could you help me understand part of your code? In `tacotron.py`, the encoder is called with:

```python
encoder_outputs = encoder_cell(embedded_inputs, tower_input_lengths[i])
```

Here `tower_input_lengths[i]` is a tensor, and it is passed into `EncoderRNN` (a BiLSTM):

```python
def __call__(self, inputs, input_lengths):
    with tf.variable_scope(self.scope):
        outputs, (fw_state, bw_state) = tf.nn.bidirectional_dynamic_rnn(
            self._fw_cell,
            self._bw_cell,
            inputs,
            sequence_length=input_lengths,
            dtype=tf.float32,
            swap_memory=True)
```

I thought `sequence_length` had to be an int, not a tensor, so how does this work when `tower_input_lengths[i]` is a tensor?
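For context on the premise of the question: the TensorFlow documentation specifies `sequence_length` in `tf.nn.bidirectional_dynamic_rnn` as an int32/int64 *vector* of shape `[batch_size]`, one length per example, not a single Python int. The sketch below (plain NumPy, not the repository's code) illustrates roughly what the RNN does with such a lengths tensor: it masks out time steps past each example's true length.

```python
import numpy as np

def length_mask(lengths, max_time):
    """Boolean mask of shape [batch, max_time]; True where t < lengths[b].

    Illustrative only: dynamic_rnn uses per-example lengths like this to
    stop stepping (and zero outputs/copy state) past each sequence's end.
    """
    t = np.arange(max_time)                             # [max_time]
    return t[None, :] < np.asarray(lengths)[:, None]    # broadcast to [batch, max_time]

# Per-example lengths, analogous to tower_input_lengths[i] (values are made up)
lengths = np.array([3, 1, 4])
mask = length_mask(lengths, max_time=4)
# [[ True  True  True False]
#  [ True False False False]
#  [ True  True  True  True]]
```

So a tensor of lengths is exactly what the API expects: each row of the batch gets its own cutoff, which a single int could not express for variable-length utterances.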