Open lucasjinreal opened 6 years ago
you should downgrade tensorflow to version 1.3.0
did you fix it?
Not yet, do you have any idea?
This happens because the code targets a deprecated TensorFlow API ...
You could fix it in models/rnn_wrappers.py: in the state_size and zero_state functions, add the required attention_state field to the returned AttentionWrapperState:

return AttentionWrapperState(
    ....
    attention_state=self._item_or_tuple(
        attention_mechanism.initial_state(batch_size, dtype)
        for attention_mechanism in self._attention_mechanisms),
    ....)
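To make the suggestion concrete, here is a minimal, self-contained sketch. The namedtuple and FakeAttentionMechanism below are simplified stand-ins for TF 1.x's AttentionWrapperState and attention mechanism classes (not the real ones), just to show where the new attention_state field comes from:

```python
import collections

# Simplified stand-in for TF 1.x's AttentionWrapperState namedtuple;
# newer TF releases added the attention_state field, which the old
# wrapper in rnn_wrappers.py does not populate.
AttentionWrapperState = collections.namedtuple(
    "AttentionWrapperState",
    ["cell_state", "attention", "time", "alignments",
     "alignment_history", "attention_state"])

class FakeAttentionMechanism:
    """Minimal mechanism exposing only initial_state, as used in zero_state."""
    def initial_state(self, batch_size, dtype):
        # Real mechanisms return a zero tensor of shape [batch_size, ...].
        return [dtype(0)] * batch_size

def zero_state(attention_mechanisms, batch_size, dtype):
    # The fix: populate attention_state from each mechanism's
    # initial_state, one entry per attention mechanism.
    return AttentionWrapperState(
        cell_state=None, attention=None, time=0,
        alignments=None, alignment_history=None,
        attention_state=tuple(m.initial_state(batch_size, dtype)
                              for m in attention_mechanisms))

state = zero_state([FakeAttentionMechanism()], batch_size=2, dtype=float)
print(state.attention_state)  # → ([0.0, 0.0],)
```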
Hi @Ella77. After doing that, I get the following error:
File "train.py", line 336, in <module>
    main()
File "train.py", line 332, in main
    train(config.model_dir, config)
File "train.py", line 160, in train
    is_randomly_initialized=is_randomly_initialized)
File "/mnt/batch/tasks/shared/LS_root/mounts/clusters/multivocal/code/Users/frederik/multi-speaker-tacotron-tensorflow/models/tacotron.py", line 170, in initialize
    cells = [OutputProjectionWrapper(concat_cell, hp.dec_rnn_size)]
File "/home/azureuser/.local/lib/python3.6/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn_cell.py", line 356, in __init__
    rnn_cell_impl.assert_like_rnncell("cell", cell)
File "/home/azureuser/.local/lib/python3.6/site-packages/tensorflow/python/ops/rnn_cell_impl.py", line 91, in assert_like_rnncell
    _hasattr(cell, "output_size"),
File "/home/azureuser/.local/lib/python3.6/site-packages/tensorflow/python/ops/rnn_cell_impl.py", line 68, in _hasattr
    getattr(obj, attr_name)
File "/mnt/batch/tasks/shared/LS_root/mounts/clusters/multivocal/code/Users/frederik/multi-speaker-tacotron-tensorflow/models/rnn_wrappers.py", line 406, in output_size
    return self._cell.output_size + self._cell.state_size.attention
File "/mnt/batch/tasks/shared/LS_root/mounts/clusters/multivocal/code/Users/frederik/multi-speaker-tacotron-tensorflow/models/rnn_wrappers.py", line 180, in state_size
    for attention_mechanism in self._attention_mechanisms),
File "/mnt/batch/tasks/shared/LS_root/mounts/clusters/multivocal/code/Users/frederik/multi-speaker-tacotron-tensorflow/models/rnn_wrappers.py", line 162, in _item_or_tuple
    t = tuple(seq)
File "/mnt/batch/tasks/shared/LS_root/mounts/clusters/multivocal/code/Users/frederik/multi-speaker-tacotron-tensorflow/models/rnn_wrappers.py", line 180, in <genexpr>
    for attention_mechanism in self._attention_mechanisms),
NameError: name 'batch_size' is not defined
Do you know a fix for this?
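For what it's worth, the traceback points at state_size: batch_size is only defined inside zero_state, so copying the same snippet into state_size raises the NameError. TF 1.x's own AttentionWrapper.state_size builds attention_state from each mechanism's static state_size attribute instead, with no batch_size involved. A hypothetical sketch of that shape (FakeAttentionMechanism is an illustrative stand-in, not the real class):

```python
class FakeAttentionMechanism:
    # Real mechanisms expose state_size as a static shape (TensorShape/int).
    state_size = 128

def state_size_attention_state(attention_mechanisms):
    # In state_size, use each mechanism's static state_size attribute;
    # no batch_size is needed (or available) here.
    return tuple(m.state_size for m in attention_mechanisms)

print(state_size_attention_state([FakeAttentionMechanism()]))  # → (128,)
```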
Hi, when running the code, I get an error:
The newest TensorFlow needs an attention_state argument; how do I add that inside this function?