Kyubyong / tacotron

A TensorFlow Implementation of Tacotron: A Fully End-to-End Text-To-Speech Synthesis Model
Apache License 2.0

Another problem #89

Open GGLW123 opened 7 years ago

GGLW123 commented 7 years ago

```
assertion failed: [When calling zero_state of AttentionWrapper attention_wrapper: Non-matching batch sizes between the memory (encoder output) and the requested batch size. Are you using the BeamSearchDecoder? If so, make sure your encoder output has been tiled to beam_width via tf.contrib.seq2seq.tile_batch, and the batch_size= argument passed to zero_state is batch_size * beam_width.] [Condition x == y did not hold element-wise:] [x (net/decoder1/attention_decoder/rnn/strided_slice:0) = ] [32] [y (net/decoder1/attention_decoder/BahdanauAttention/strided_slice_1:0) = ] [15]
	 [[Node: net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert = Assert[T=[DT_STRING, DT_STRING, DT_STRING, DT_INT32, DT_STRING, DT_INT32], summarize=3, _device="/job:localhost/replica:0/task:0/cpu:0"](net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/All/_891, net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert/data_0, net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert/data_1, net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert/data_2, net/decoder1/attention_decoder/rnn/strided_slice/_893, net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert/data_4, net/decoder1/attention_decoder/BahdanauAttention/strided_slice_1/_895)]]
	 [[Node: net/decoder1/attention_decoder/rnn/while/rnn/attention_wrapper/assert_equal/Assert/Assert/_910 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/gpu:0", send_device="/job:localhost/replica:0/task:0/cpu:0", send_device_incarnation=1, tensor_name="edge_1774_net/decoder1/attention_decoder/rnn/while/rnn/attention_wrapper/assert_equal/Assert/Assert", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/gpu:0"](^_cloopnet/decoder1/attention_decoder/rnn/while/rnn/attention_wrapper/checked_cell_output/_512)]]

Caused by op 'net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert', defined at:
  File "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\Python Tools for Visual Studio\2.2\visualstudio_py_launcher.py", line 78, in <module>
    vspd.debug(filename, port_num, debug_id, debug_options, run_as)
  File "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\Python Tools for Visual Studio\2.2\visualstudio_py_debugger.py", line 2483, in debug
    exec_file(file, globals_obj)
  File "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\Python Tools for Visual Studio\2.2\visualstudio_py_util.py", line 111, in exec_file
    exec_code(code, file, global_variables)
  File "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\Python Tools for Visual Studio\2.2\visualstudio_py_util.py", line 87, in exec_code
    exec(code_obj, global_variables)
  File "C:\Users\User\Documents\Visual Studio 2015\Projects\Tachatron\Tachatron\eval.py", line 69, in <module>
    eval()
  File "C:\Users\User\Documents\Visual Studio 2015\Projects\Tachatron\Tachatron\eval.py", line 27, in eval
    g = Graph(is_training=False)
  File "C:\Users\User\Documents\Visual Studio 2015\Projects\Tachatron\Tachatron\train.py", line 49, in __init__
    is_training=is_training) # (N, T', hp.n_mels*hp.r)
  File "C:\Users\User\Documents\Visual Studio 2015\Projects\Tachatron\Tachatron\network.py", line 85, in decode1
    dec = attention_decoder(dec, memory, num_units=hp.embed_size) # (N, T', E)
  File "C:\Users\User\Documents\Visual Studio 2015\Projects\Tachatron\Tachatron\modules.py", line 251, in attention_decoder
    dtype=tf.float32) # (N, T', 16)
  File "C:\Users\User\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\ops\rnn.py", line 548, in dynamic_rnn
    state = cell.zero_state(batch_size, dtype)
  File "C:\Users\User\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\contrib\seq2seq\python\ops\attention_wrapper.py", line 659, in zero_state
    message=error_message)]):
  File "C:\Users\User\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\ops\check_ops.py", line 318, in assert_equal
    return control_flow_ops.Assert(condition, data, summarize=summarize)
  File "C:\Users\User\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\util\tf_should_use.py", line 170, in wrapped
    return _add_should_use_warning(fn(*args, **kwargs))
  File "C:\Users\User\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\ops\control_flow_ops.py", line 124, in Assert
    condition, data, summarize, name="Assert")
  File "C:\Users\User\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\ops\gen_logging_ops.py", line 37, in _assert
    summarize=summarize, name=name)
  File "C:\Users\User\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 767, in apply_op
    op_def=op_def)
  File "C:\Users\User\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\framework\ops.py", line 2506, in create_op
    original_op=self._default_original_op, op_def=op_def)
  File "C:\Users\User\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\framework\ops.py", line 1269, in __init__
    self._traceback = _extract_stack()

InvalidArgumentError (see above for traceback): assertion failed: [When calling zero_state of AttentionWrapper attention_wrapper: Non-matching batch sizes between the memory (encoder output) and the requested batch size. Are you using the BeamSearchDecoder? If so, make sure your encoder output has been tiled to beam_width via tf.contrib.seq2seq.tile_batch, and the batch_size= argument passed to zero_state is batch_size * beam_width.] [Condition x == y did not hold element-wise:] [x (net/decoder1/attention_decoder/rnn/strided_slice:0) = ] [32] [y (net/decoder1/attention_decoder/BahdanauAttention/strided_slice_1:0) = ] [15]
	 [[Node: net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert = Assert[T=[DT_STRING, DT_STRING, DT_STRING, DT_INT32, DT_STRING, DT_INT32], summarize=3, _device="/job:localhost/replica:0/task:0/cpu:0"](net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/All/_891, net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert/data_0, net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert/data_1, net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert/data_2, net/decoder1/attention_decoder/rnn/strided_slice/_893, net/decoder1/attention_decoder/rnn/AttentionWrapperZeroState/assert_equal/Assert/Assert/data_4, net/decoder1/attention_decoder/BahdanauAttention/strided_slice_1/_895)]]
	 [[Node: net/decoder1/attention_decoder/rnn/while/rnn/attention_wrapper/assert_equal/Assert/Assert/_910 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/gpu:0", send_device="/job:localhost/replica:0/task:0/cpu:0", send_device_incarnation=1, tensor_name="edge_1774_net/decoder1/attention_decoder/rnn/while/rnn/attention_wrapper/assert_equal/Assert/Assert", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/gpu:0"](^_cloopnet/decoder1/attention_decoder/rnn/while/rnn/attention_wrapper/checked_cell_output/_512)]]
```
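For context: the assertion says the attention memory (encoder output) has batch size 15 while the decoder's `zero_state` was asked for batch size 32. Since no `BeamSearchDecoder` is involved here, a plausible cause is that `eval.py` feeds a final batch of only 15 sentences while the graph still requests `hp.batch_size` (32). One way to avoid that, sketched below with a hypothetical NumPy helper (not from this repo), is to pad the short batch up to the fixed batch size before feeding it:

```python
import numpy as np

def pad_to_batch_size(inputs, batch_size):
    """Pad a short final batch with zero rows so its first dimension
    equals batch_size. Hypothetical helper; padded rows can be
    discarded from the model output afterwards."""
    n = inputs.shape[0]
    if n >= batch_size:
        return inputs
    pad = np.zeros((batch_size - n,) + inputs.shape[1:], dtype=inputs.dtype)
    return np.concatenate([inputs, pad], axis=0)

# 15 encoded texts of length 50, but the graph expects hp.batch_size = 32,
# mirroring the x = 32 vs y = 15 mismatch in the assertion above
texts = np.ones((15, 50), dtype=np.int32)
padded = pad_to_batch_size(texts, 32)
print(padded.shape)  # (32, 50)
```

Alternatively, building the evaluation graph with the actual number of evaluation sentences as its batch size would make the two numbers agree.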
dileepfrog commented 6 years ago

Did you ever figure this out?