farizrahman4u / seq2seq

Sequence to Sequence Learning with Keras
GNU General Public License v2.0

LSTMDecoderCell needs a get_config(self) #188

Open inzaghi250 opened 7 years ago

inzaghi250 commented 7 years ago

I found that the latest commit removes the old get_config(self).

However, in __init__:

def __init__(self, hidden_dim=None, **kwargs):
    if hidden_dim:
        self.hidden_dim = hidden_dim
        kwargs['units'] = hidden_dim
    else:
        self.hidden_dim = kwargs['units']
    super(LSTMDecoderCell, self).__init__(**kwargs)

'units' is expected to be present in the config passed to __init__, but it is no longer recorded anywhere. This can cause a missing key (KeyError: 'units') when we serialize and then deserialize LSTMDecoderCell.
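For reference, a minimal sketch of what a restored get_config could look like (my guess, not necessarily what the removed version did): record hidden_dim alongside the base config so deserialization can find it.

    def get_config(self):
        # Record hidden_dim explicitly so __init__ can restore it on load.
        config = {'hidden_dim': self.hidden_dim}
        base_config = super(LSTMDecoderCell, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))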

abhaikollara commented 7 years ago

@inzaghi250 The kwargs['units'] = hidden_dim line was a bug; it has been removed in the latest commit.

inzaghi250 commented 7 years ago

@abhaikollara The remaining line self.hidden_dim = kwargs['units'] still causes a KeyError. When LSTMDecoderCell is serialized, the get_config() of ExtendedRNNCell is used because LSTMDecoderCell does not define its own, so the 'units' attribute is not included in the config. When deserialized, __init__ is called with the default hidden_dim (None), which executes the aforementioned line and fails.
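Here is a small, library-free reproduction of that mechanism (class and names are made up for illustration): the config omits 'units', so rebuilding the object from it hits the kwargs['units'] lookup and raises KeyError.

    class Cell(object):
        def __init__(self, hidden_dim=None, **kwargs):
            if hidden_dim:
                self.hidden_dim = hidden_dim
            else:
                self.hidden_dim = kwargs['units']  # KeyError when 'units' is missing

        def get_config(self):
            # Stands in for a base-class get_config that knows nothing about 'units'.
            return {}

        @classmethod
        def from_config(cls, config):
            return cls(**config)

    config = Cell(units=16).get_config()   # serialization drops 'units'
    try:
        Cell.from_config(config)           # deserialization calls __init__ with defaults
    except KeyError as e:
        print('KeyError:', e)              # -> KeyError: 'units'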

abhaikollara commented 7 years ago

@inzaghi250 Update seq2seq; your issue should be solved.

inzaghi250 commented 7 years ago

@abhaikollara I see the fix:

    12 if hidden_dim:
    13     self.hidden_dim = hidden_dim
    14 else:
    15     self.hidden_dim = self.output_dim
    16 super(LSTMDecoderCell, self).__init__(**kwargs)

However, self.output_dim is still accessed before it is initialized, because it is only set inside the super().__init__ call at Line 16?
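One possible reordering, assuming ExtendedRNNCell.__init__ does not itself need hidden_dim and that it is what sets output_dim (just a sketch, not the project's actual fix):

    def __init__(self, hidden_dim=None, **kwargs):
        # Let the parent __init__ set output_dim first, then derive hidden_dim from it.
        super(LSTMDecoderCell, self).__init__(**kwargs)
        self.hidden_dim = hidden_dim if hidden_dim else self.output_dim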

inzaghi250 commented 7 years ago

@farizrahman4u Please help check the above issue. It causes simple save & load of models to fail.

farizrahman4u commented 7 years ago

@abhaikollara