allenai / allennlp

An open-source NLP research library, built on PyTorch.
http://www.allennlp.org
Apache License 2.0

Configuration error on coreference resolution while using model coref-bert-lstm-2020.02.12. #3792

Closed lighteternal closed 4 years ago

lighteternal commented 4 years ago

Hi, while trying to execute a simple coreference-resolution query using the latest bert-lstm model as follows:

from allennlp.predictors.predictor import Predictor
predictor = Predictor.from_path("https://storage.googleapis.com/allennlp-public-models/coref-bert-lstm-2020.02.12.tar.gz")
predictor.predict(
    document="I went to the shop to buy my new sweater, it was very warm."
)

I get this configuration error: ConfigurationError: "pretrained_transformer_mismatched not in acceptable choices for model.text_field_embedder.token_embedders.tokens.type: ['embedding', 'character_encoding', 'elmo_token_embedder', 'elmo_token_embedder_multilang', 'openai_transformer_embedder', 'bert-pretrained', 'language_model_token_embedder', 'bidirectional_lm_token_embedder', 'bag_of_word_counts', 'pass_through']"

This error does not appear when using the older coref-model-2018.02.05 model. I suspect the new model is missing some of the classes the old one had. Can you please advise? Many thanks!

DeNeutoy commented 4 years ago

This model is not compatible with allennlp 0.9 - it requires the master branch. Sorry about that!
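One way to install from the master branch (a sketch; the thread doesn't specify an exact commit, and a virtual environment is assumed) is pip's VCS support:

```shell
# Install AllenNLP directly from the master branch (pre-1.0).
# Note: this replaces any allennlp already installed in the current
# environment, so doing it inside a fresh virtualenv is advisable.
pip install git+https://github.com/allenai/allennlp.git@master
```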

lighteternal commented 4 years ago

Thank you for the prompt reply. I understand this is an upcoming change listed in the v1.0 pre-release notes. So after installing from the master branch, the import procedure is identical to the v0.9 one?

DeNeutoy commented 4 years ago

Should be, yes.

lighteternal commented 4 years ago

Will try then, many thanks!

Update: It works with allennlp v0.9.1.
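For reference, once the prediction succeeds, the coref predictor returns token spans rather than strings. A small sketch of mapping those spans back to text follows; the sample `output` dict below is hand-written for illustration (assuming the standard `"document"`/`"clusters"` keys with inclusive `[start, end]` token spans), not an actual model prediction:

```python
def cluster_mentions(output):
    """Map each cluster's inclusive [start, end] token spans to surface text."""
    tokens = output["document"]
    return [
        [" ".join(tokens[start:end + 1]) for start, end in cluster]
        for cluster in output["clusters"]
    ]

# Hypothetical output for the sentence in this issue:
output = {
    "document": ["I", "went", "to", "the", "shop", "to", "buy", "my", "new",
                 "sweater", ",", "it", "was", "very", "warm", "."],
    "clusters": [[[7, 9], [11, 11]]],  # "my new sweater" <- "it"
}

print(cluster_mentions(output))  # [['my new sweater', 'it']]
```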