Open Neel125 opened 4 years ago
Facing the same issue while trying to use tensorflow_addons with TF 2.x.
Facing the same issue with TensorFlow 2.x.
Got the solution: just replace zero_state with get_initial_state. get_initial_state returns an AttentionWrapperState tuple containing zeroed-out tensors, the same as zero_state did.
Hello @princebaretto99, I had already found this solution the same day I encountered the issue, but sorry, I forgot to update it here on GitHub. The zero_state issue is resolved by using get_initial_state. Thank you for your solution.
Error:

  File "/home/ml-ai4/Neel-dev023/ChatBot/nmt-chatbot/nmt/nmt/attention_model.py", line 144, in _build_decoder_cell
    decoder_initial_state = cell.zero_state(batch_size=batch_size*hparams["beam_width"], dtype=dtype).clone(
  File "/home/ml-ai4/Neel-dev023/ChatBot/nmt-chatbot/venv/lib/python3.6/site-packages/tensorflow_core/python/ops/rnn_cell_wrapper_impl.py", line 199, in zero_state
    return self.cell.zero_state(batch_size, dtype)
  File "/home/ml-ai4/Neel-dev023/ChatBot/nmt-chatbot/venv/lib/python3.6/site-packages/tensorflow_core/python/ops/rnn_cell_wrapper_impl.py", line 431, in zero_state
    return self.cell.zero_state(batch_size, dtype)
AttributeError: 'AttentionWrapper' object has no attribute 'zero_state'