allenai / bilm-tf

Tensorflow implementation of contextualized word representations from bi-directional language models
Apache License 2.0

How to use ELMo in Keras version? #227

Open LucasJau opened 4 years ago

LucasJau commented 4 years ago

For some reason, I have to use ELMo in Keras. When I write a Keras layer for ELMo, I notice that if the input to BidirectionalLanguageModel is a keras.layers.Input, the whole program hangs; if the input is a tf.placeholder instead, the code runs through, but then a Keras Model won't accept a non-Input tensor as its input. How can I fix this? Please help me.

The layer is implemented like:

```python
import keras.backend as K
from keras.layers import Layer
from bilm import BidirectionalLanguageModel, weight_layers


class ElmoEmbeddingLayer(Layer):

    def __init__(self, config, **kwargs):
        self.dimensions = 200
        self.options_file = config.options_file
        self.weights_file = config.weights_file
        self.token_embedding_file = config.token_embedding_file
        self.bilm = BidirectionalLanguageModel(
            self.options_file,
            self.weights_file,
            use_character_inputs=False,
            embedding_weight_file=self.token_embedding_file,
            max_batch_size=1024
        )
        super(ElmoEmbeddingLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        super(ElmoEmbeddingLayer, self).build(input_shape)

    def call(self, x, mask=None):
        context_embeddings_op = self.bilm(x)
        elmo_embedding = weight_layers('elmo_output', context_embeddings_op, l2_coef=0.0)
        elmo_embedding = elmo_embedding['weighted_op']
        return elmo_embedding

    def compute_mask(self, inputs, mask=None):
        return K.not_equal(inputs, 0)

    def compute_output_shape(self, input_shape):
        return input_shape[0], input_shape[1], self.dimensions
```
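For reference, `weight_layers` in bilm combines the biLM layer activations as a softmax-weighted sum scaled by a learned scalar `gamma` (the standard ELMo combination). A minimal numpy sketch of just that math, with `elmo_weighted_sum` being a hypothetical helper name, not part of the bilm API:

```python
import numpy as np

def elmo_weighted_sum(layer_activations, scalar_weights, gamma=1.0):
    """Softmax-normalize per-layer scalars, then take a weighted sum
    over the biLM layers -- the combination weight_layers computes."""
    w = np.exp(scalar_weights - np.max(scalar_weights))
    w = w / w.sum()
    # stacked: (n_layers, batch, time, dim)
    stacked = np.stack(layer_activations, axis=0)
    # contract the layer axis against the weights
    return gamma * np.tensordot(w, stacked, axes=(0, 0))

# toy check: 3 biLM layers, batch=2, time=4, dim=5
layers = [np.random.rand(2, 4, 5) for _ in range(3)]
# zero scalars -> uniform softmax weights -> plain layer mean
out = elmo_weighted_sum(layers, np.zeros(3))
```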

And when I use it I just:

```python
elmo_model = ElmoEmbeddingLayer(self.data_config)

tmp = tf.placeholder(tf.int32, shape=(None, None))
char_ids = Input(batch_shape=(None, None), dtype='int32', name='input_ids')

elmo_embeddings = elmo_model(char_ids)
lstm_output_1 = Bidirectional(LSTM(units=_char_lstm_size, return_sequences=True))(elmo_embeddings)
```

With "char_ids" the program hangs; with "tmp" the model can't be built, because Keras rejects a non-Input layer as the model's input.
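As an aside, the `compute_mask` above treats token id 0 as padding. Restated in plain numpy, outside any Keras machinery (`padding_mask` is a hypothetical helper name for illustration):

```python
import numpy as np

def padding_mask(token_ids):
    # True where there is a real token, False on 0-padding --
    # the same logic as K.not_equal(inputs, 0) in compute_mask
    return np.asarray(token_ids) != 0

ids = np.array([[5, 9, 3, 0, 0],
                [7, 2, 0, 0, 0]])
mask = padding_mask(ids)
```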

pinesnow72 commented 3 years ago

Did you solve this issue? I am also trying to figure out how to build a Keras layer for ELMo, and I'm stuck the same way you are.