google-research / albert

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
Apache License 2.0

Is it possible to reload a pretrained tfhub albert in tf2.0? #11

Closed mcggood closed 4 years ago

mcggood commented 4 years ago

For example:

ALBERT_PATH = "xxx"  # a pretrained tfhub albert model
albert_layer = hub.KerasLayer(ALBERT_PATH, trainable=True)

Tony-Y commented 4 years ago

https://github.com/kpe/bert-for-tf2

Tony-Y commented 4 years ago
import tensorflow as tf
import tensorflow_hub as hub

max_seq_length = 128  # example value

albert_layer = hub.KerasLayer("https://tfhub.dev/google/albert_xxlarge/2",
    trainable=True, signature='tokens', signature_outputs_as_dict=True)
input_ids = tf.keras.layers.Input(shape=(max_seq_length,),
                                  dtype=tf.int32, name="input_ids")
input_mask = tf.keras.layers.Input(shape=(max_seq_length,),
                                   dtype=tf.int32, name="input_mask")
segment_ids = tf.keras.layers.Input(shape=(max_seq_length,),
                                    dtype=tf.int32, name="segment_ids")
albert_inputs = [input_ids, input_mask, segment_ids]
outputs = albert_layer(inputs=dict(
    input_ids=input_ids,
    input_mask=input_mask,
    segment_ids=segment_ids,
))

Unfortunately, trainable=True does not work.

e.g. https://www.kaggle.com/c/google-quest-challenge/discussion/121393#698989
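For reference, the three tensors fed to the layer above (input_ids, input_mask, segment_ids) are typically built from a tokenized sentence by truncating, masking real tokens, and zero-padding. A minimal sketch, with no tokenizer dependency (build_inputs is a hypothetical helper and the token ids are illustrative, not from a real vocabulary):

```python
def build_inputs(token_ids, max_seq_length):
    """Build input_ids, input_mask, segment_ids for a single sentence."""
    ids = token_ids[:max_seq_length]          # truncate to max length
    pad = max_seq_length - len(ids)
    input_ids = ids + [0] * pad               # zero-pad token ids
    input_mask = [1] * len(ids) + [0] * pad   # 1 for real tokens, 0 for padding
    segment_ids = [0] * max_seq_length        # all 0 for a single segment
    return input_ids, input_mask, segment_ids

# Illustrative token ids for a short sentence.
print(build_inputs([2, 14, 9, 3], 8))
# → ([2, 14, 9, 3, 0, 0, 0, 0], [1, 1, 1, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0])
```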

0x0539 commented 4 years ago

Sorry for the delayed response. This looks like a duplicate of #112. We only support ALBERT on TF 1.15 at this time. We haven't really tested ALBERT in TF 2.0 and don't have plans to release a TF 2.0-compatible version of the ALBERT modules (yet).