I don't think you can run the "coref-spanbert-large-2020.02.27.tar.gz" model with AllenNLP v0.9.0
You need a newer version to run this model. Try these packages if you want the same results as the demo:
AllenNLP package: https://github.com/allenai/allennlp/tree/adeb1b1278619ff2d74d4fd82825e50a36f95ff4
Model package: https://github.com/allenai/allennlp-models/tree/baf3a1ec3b74273a4ffa2112d37fb88e8b3dd39c
Isn't 0.9.0 the newest official release of AllenNLP?
I have installed the packages you suggested into a clean conda env as follows:
pip install git+git://github.com/allenai/allennlp.git@adeb1b1278619ff2d74d4fd82825e50a36f95ff4
pip install git+https://github.com/allenai/allennlp-models.git@baf3a1ec3b74273a4ffa2112d37fb88e8b3dd39c
This installs the versions:
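A quick, generic way to confirm which versions actually ended up in the environment (this uses only standard pkg_resources, nothing AllenNLP-specific, so treat it as a sketch):

# Print the installed version of each package, or a note if it cannot be found.
import pkg_resources

for package in ("allennlp", "allennlp-models"):
    try:
        print(package, pkg_resources.get_distribution(package).version)
    except pkg_resources.DistributionNotFound:
        print(package, "not found (it may be installed under a different name)")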
When using the code from the demo, I still get the error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/CE/skr/anaconda3/envs/env_allennlp3/lib/python3.7/site-packages/allennlp/predictors/predictor.py", line 268, in from_path
load_archive(archive_path, cuda_device=cuda_device),
File "/home/CE/skr/anaconda3/envs/env_allennlp3/lib/python3.7/site-packages/allennlp/models/archival.py", line 186, in load_archive
cuda_device=cuda_device,
File "/home/CE/skr/anaconda3/envs/env_allennlp3/lib/python3.7/site-packages/allennlp/models/model.py", line 340, in load
model_class: Type[Model] = cls.by_name(model_type) # type: ignore
File "/home/CE/skr/anaconda3/envs/env_allennlp3/lib/python3.7/site-packages/allennlp/common/registrable.py", line 137, in by_name
subclass, constructor = cls.resolve_class_name(name)
File "/home/CE/skr/anaconda3/envs/env_allennlp3/lib/python3.7/site-packages/allennlp/common/registrable.py", line 185, in resolve_class_name
f"{name} is not a registered name for {cls.__name__}. "
allennlp.common.checks.ConfigurationError: coref is not a registered name for Model. You probably need to use the --include-package flag to load your custom code. Alternatively, you can specify your choices using fully-qualified paths, e.g. {"model": "my_module.models.MyModel"} in which case they will be automatically imported correctly.
Any ideas?
Use this code:
from allennlp.predictors import Predictor
import allennlp_models.coref  # importing this registers the "coref" model class
coref_model = Predictor.from_path('..../coref-spanbert-large-2020.02.27.tar.gz')
coref_model.coref_resolved(text)
or
coref_model.predict(text)
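For anyone who finds this later, here is a minimal end-to-end sketch of that suggestion; the "clusters" key comes from the demo's JSON output, so double-check it against your version:

from allennlp.predictors import Predictor
import allennlp_models.coref  # registers the "coref" model so the archive can be loaded

# The public archive used by the demo.
model_url = "https://storage.googleapis.com/allennlp-public-models/coref-spanbert-large-2020.02.27.tar.gz"
predictor = Predictor.from_path(model_url)

text = "The woman reading a newspaper sat on the bench with her dog."

# Returns the text with mentions replaced by their antecedents.
print(predictor.coref_resolved(text))

# Returns a dict; "clusters" groups the token-index spans that corefer.
prediction = predictor.predict(document=text)
print(prediction["clusters"])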
Thank you, your suggestions worked.
Alternatively, what one can do to load the model is:
- install the dev version 0.9.1 with pip install allennlp==0.9.1.dev20200228
- use the code snippet from the demo, so there is no need to install allennlp_models or import it
Hi! I am currently using 0.9.1 without allennlp_models, but I still cannot get any output from the code in the demo. It seems like the program freezes without producing anything. Do you have any idea about this issue? Thank you!
I didn't have a similar problem. What are your system specifications? Did you install the packages into a clean environment?
They released 1.0.0rc1 yesterday, and after installing it in a clean environment I am able to use SpanBERT now!
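Concretely, that setup looks roughly like this (the unpinned allennlp-models line is an assumption; pick whichever pre-release is documented as compatible with 1.0.0rc1):
pip install allennlp==1.0.0rc1
pip install allennlp-models
After that, the import allennlp_models.coref snippet from the earlier comment works unchanged.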
Describe the bug
Loading the model for coreference resolution seems not to work: the model is not loaded/recognized.
Log:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/predictors/predictor.py", line 256, in from_path
return Predictor.from_archive(load_archive(archive_path, cuda_device=cuda_device), predictor_name,
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/models/archival.py", line 230, in load_archive
cuda_device=cuda_device)
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/models/model.py", line 327, in load
return cls.by_name(model_type)._load(config, serialization_dir, weights_file, cuda_device)
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/models/model.py", line 265, in _load
model = Model.from_params(vocab=vocab, params=model_params)
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/common/from_params.py", line 365, in from_params
return subclass.from_params(params=params, **extras)
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/common/from_params.py", line 386, in from_params
kwargs = create_kwargs(cls, params, **extras)
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/common/from_params.py", line 133, in create_kwargs
kwargs[name] = construct_arg(cls, name, annotation, param.default, params, **extras)
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/common/from_params.py", line 229, in construct_arg
return annotation.from_params(params=subparams, **subextras)
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/common/from_params.py", line 365, in from_params
return subclass.from_params(params=params, **extras)
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/modules/text_field_embedders/basic_text_field_embedder.py", line 160, in from_params
for name, subparams in token_embedder_params.items()
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/modules/text_field_embedders/basic_text_field_embedder.py", line 160, in <dictcomp>
for name, subparams in token_embedder_params.items()
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/common/from_params.py", line 359, in from_params
default_to_first_choice=default_to_first_choice)
File "/home/CE/skr/anaconda3/envs/env_allennlp2/lib/python3.7/site-packages/allennlp/common/params.py", line 363, in pop_choice
raise ConfigurationError(message)
allennlp.common.checks.ConfigurationError: 'pretrained_transformer_mismatched not in acceptable choices for model.text_field_embedder.token_embedders.tokens.type: [\'embedding\', \'character_encoding\', \'elmo_token_embedder\', \'elmo_token_embedder_multilang\', \'openai_transformer_embedder\', \'bert-pretrained\', \'language_model_token_embedder\', \'bidirectional_lm_token_embedder\', \'bag_of_word_counts\', \'pass_through\', \'pretrained_transformer\']. You should either use the --include-package flag to make sure the correct module is loaded, or use a fully qualified class name in your config file like {"model": "my_module.models.MyModel"} to have it imported automatically.'
To Reproduce
The code:
from allennlp.predictors.predictor import Predictor
predictor = Predictor.from_path("https://storage.googleapis.com/allennlp-public-models/coref-spanbert-large-2020.02.27.tar.gz")
predictor.predict(document="The woman reading a newspaper sat on the bench with her dog.")
The error is raised after defining the predictor variable.
Expected behavior
Load the model with no errors.
System (please complete the following information):
Additional context
I was following the demo here. I also looked at this example (the part about loading the coreference model), but the same error was raised. Using the predictor_name argument found here didn't solve the issue either.
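For completeness, that attempt looks roughly like this (the "coreference-resolution" name is my assumption based on the 0.9.x predictor registry); it fails in the same way, because the ConfigurationError above is raised while the Model is being built from the archive's config, before any predictor is selected:

from allennlp.predictors.predictor import Predictor

# predictor_name only chooses which Predictor wraps the loaded model;
# the archive's model config is parsed first, so the same error appears.
predictor = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/coref-spanbert-large-2020.02.27.tar.gz",
    predictor_name="coreference-resolution",  # registry name assumed for 0.9.x
)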