UKPLab / sentence-transformers

State-of-the-Art Text Embeddings
https://www.sbert.net
Apache License 2.0

keras layer #188

Open shainaraza opened 4 years ago

shainaraza commented 4 years ago

Can I use a Keras layer with BERT embeddings from sentence-transformers?

```python
from tensorflow.keras.layers import Dense, Dot, Reshape
from tensorflow.keras.models import Model

def build_model(headline, snippet, h_embeddings, s_embeddings, classification=True):
    # Cosine similarity between the two embedding branches
    merged = Dot(name='dot_product', normalize=True, axes=1)([h_embeddings, s_embeddings])

    # Reshape to be a single number (shape will be (None, 1))
    merged = Reshape(target_shape=[1])(merged)

    # If classification, add an extra layer and use binary cross-entropy loss
    if classification:
        merged = Dense(1, activation='sigmoid')(merged)
        model = Model(inputs=[headline, snippet], outputs=merged)
        model.compile(optimizer='Adam', loss='binary_crossentropy', metrics=['accuracy'])

    # Otherwise the loss function is mean squared error
    else:
        model = Model(inputs=[headline, snippet], outputs=merged)
        model.compile(optimizer='Adam', loss='mse')

    return model
```
nreimers commented 4 years ago

Hi @shainaraza, this library works with PyTorch; Keras, sadly, is based on TensorFlow.

You could create the embeddings and use them as inputs for keras.

Or you can follow the description from HuggingFace to convert the fine-tuned BERT model to TensorFlow and then integrate it into Keras.
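For the second option, a possible route (an assumption on my part, not spelled out in the thread) is the `from_pt=True` flag in the transformers library, which loads a PyTorch checkpoint into a TensorFlow model class; the checkpoint name below is only a placeholder for your fine-tuned model.

```python
from transformers import TFBertModel

# Load PyTorch weights into a TensorFlow BERT; replace the name with
# the directory of your fine-tuned checkpoint.
tf_model = TFBertModel.from_pretrained("bert-base-uncased", from_pt=True)
```

The resulting `tf_model` is a `tf.keras.Model` subclass, so it can be composed with other Keras layers directly.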

Best,
Nils Reimers

shainaraza commented 4 years ago

Thanks a lot for the prompt reply, it was indeed useful.