OnlpLab / AlephBERT

Using AlephBERT in evaluation mode from allennlp #3

Open yuvalkry opened 2 years ago

yuvalkry commented 2 years ago

Hi,

I am loading AlephBERT through the "transformers" package via allennlp, using the following jsonnet definition:

"token_embedders": {             "bert": {                 "type": "pretrained_transformer",                 "model_name": "onlplab/alephbert-base",                 "eval_mode": true,             }         }

The flag "eval_mode" indicates that I am not going to train the model, I am using it in order to calculate embeddings.

When running, the module which loads the model (transformers.modeling_utils) issues the following message:

```
Some weights of BertModel were not initialized from the model checkpoint at onlplab/alephbert-base and are newly initialized: ['bert.pooler.dense.weight', 'bert.pooler.dense.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
```
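For reference, a minimal sketch (assuming the standard transformers BertModel API) that loads the same checkpoint directly and shows which output the pooler head would feed:

```python
# The weights named in the warning belong to BERT's pooler head, which only
# produces pooler_output; the per-token embeddings come from last_hidden_state.
# Loading without the pooler should also avoid the "newly initialized" warning.
import torch
from transformers import AutoTokenizer, BertModel

tokenizer = AutoTokenizer.from_pretrained("onlplab/alephbert-base")
model = BertModel.from_pretrained("onlplab/alephbert-base", add_pooling_layer=False).eval()

inputs = tokenizer("שלום עולם", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # per-token embeddings from the encoder
print(outputs.pooler_output)            # None here, since no pooler was created
```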

Do these weights affect the calculation of the embeddings? If so, how can I fix it?

Thanks, Yuval