lsdefine / attention-is-all-you-need-keras

A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need

LayerNormalization #1

Closed · AMSakhnov closed this issue 6 years ago

AMSakhnov commented 6 years ago

For Python 2.7 compatibility, in `class LayerNormalization(Layer):` you need to change

`super().__init__(**kwargs)` to `super(LayerNormalization, self).__init__(**kwargs)`

and

`super().build(input_shape)` to `super(LayerNormalization, self).build(input_shape)`.

The zero-argument `super()` form is Python 3 only.
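
For reference, a minimal sketch of a custom layer-normalization layer using the explicit two-argument `super()` calls, which work on both Python 2.7 and Python 3. It assumes Keras 2.x (`Layer` importable from `keras.layers`); the `gamma`/`beta` weights and the `eps` default are illustrative, not necessarily identical to this repo's implementation:

```python
from keras import backend as K
from keras.layers import Layer


class LayerNormalization(Layer):
    def __init__(self, eps=1e-6, **kwargs):
        self.eps = eps
        # Explicit two-argument super() runs on Python 2.7 and Python 3;
        # the zero-argument super() form is Python 3 only.
        super(LayerNormalization, self).__init__(**kwargs)

    def build(self, input_shape):
        # Learnable scale and shift over the last axis (illustrative names).
        self.gamma = self.add_weight(name='gamma', shape=input_shape[-1:],
                                     initializer='ones', trainable=True)
        self.beta = self.add_weight(name='beta', shape=input_shape[-1:],
                                    initializer='zeros', trainable=True)
        super(LayerNormalization, self).build(input_shape)

    def call(self, x):
        # Normalize over the feature axis, then rescale and shift.
        mean = K.mean(x, axis=-1, keepdims=True)
        std = K.std(x, axis=-1, keepdims=True)
        return self.gamma * (x - mean) / (std + self.eps) + self.beta

    def compute_output_shape(self, input_shape):
        return input_shape
```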

lsdefine commented 6 years ago

Thanks. Updated.