Closed masoudh175 closed 4 years ago
Hi @masoudh175,
If I had to guess, it looks like you didn't install AllenNLP from source. Is that the case?
Unfortunately, this is required for the time being (as noted in the README). Please install AllenNLP from source, then try calling `pip install git+https://github.com/JohnGiorgi/DeCLUTR.git` again.
Let me know if that solves your issue.
Yes, that was the problem. Thanks!
I followed the instructions here to install the package. Then I tried to run this code, but I am getting this error:
```
/opt/anaconda3/envs/allennlp/lib/python3.7/site-packages/declutr/modules/token_embedders/pretrained_transformer_embedder_mlm.py in <module>
      7 from allennlp.modules.token_embedders.token_embedder import TokenEmbedder
      8 from overrides import overrides
----> 9 from transformers import AutoConfig, AutoModelForMaskedLM
     10
     11

ImportError: cannot import name 'AutoModelForMaskedLM' from 'transformers' (/opt/anaconda3/envs/allennlp/lib/python3.7/site-packages/transformers/__init__.py)
```
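The `ImportError` above means the pinned `transformers` release in this environment does not yet expose `AutoModelForMaskedLM`. A quick way to diagnose this kind of version-dependent import, sketched with a standard-library-only helper (the helper name `module_provides` is my own, not part of any library):

```python
import importlib


def module_provides(module_name: str, attr_name: str) -> bool:
    """Return True if `module_name` can be imported and exposes `attr_name`.

    Hypothetical diagnostic helper for version-dependent imports such as
    `from transformers import AutoModelForMaskedLM`: an ImportError for a
    single name usually means the installed release predates that name.
    """
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, attr_name)
```

Here `module_provides("transformers", "AutoModelForMaskedLM")` would come back `False` in this environment, which is exactly the failure mode in the traceback.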
To solve the issue I ran

```
pip install -U transformers
```

which caused these warnings:

```
allennlp 1.0.0 requires torch<1.6.0,>=1.5.0, but you'll have torch 1.6.0 which is incompatible.
allennlp 1.0.0 requires transformers<2.12,>=2.9, but you'll have transformers 3.0.2 which is incompatible.
```
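The warnings report that the upgraded versions fall outside allennlp 1.0.0's pins (`torch<1.6.0,>=1.5.0` and `transformers<2.12,>=2.9`). The check pip is doing can be sketched naively; a real resolver uses PEP 440 rules (pre-releases, epochs, etc.), which this toy version comparison ignores:

```python
def version_tuple(version: str) -> tuple:
    """Parse a simple 'X.Y.Z' version string into a comparable tuple.

    Naive sketch: does not handle pre-releases, epochs, or local versions.
    """
    return tuple(int(part) for part in version.split("."))


def satisfies_range(installed: str, lower: str, upper: str) -> bool:
    """Check a pip-style pin of the form `>=lower,<upper`."""
    return version_tuple(lower) <= version_tuple(installed) < version_tuple(upper)
```

With the pins from the warning, `satisfies_range("3.0.2", "2.9", "2.12")` is `False`, so pip flags transformers 3.0.2 as incompatible; likewise torch 1.6.0 fails the `<1.6.0` bound.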
Upgrading the transformers library changed the first error message to this:
```
/opt/anaconda3/envs/allennlp/lib/python3.7/site-packages/transformers/tokenization_utils_fast.py in _batch_encode_plus(self, batch_text_or_text_pairs, add_special_tokens, padding_strategy, truncation_strategy, max_length, stride, is_pretokenized, pad_to_multiple_of, return_tensors, return_token_type_ids, return_attention_mask, return_overflowing_tokens, return_special_tokens_mask, return_offsets_mapping, return_length, verbose, **kwargs)
    311
    312         if kwargs:
--> 313             raise ValueError(f"Keyword arguments {kwargs} not recognized.")
    314
    315         # Set the truncation and padding strategy and restore the initial configuration

ValueError: Keyword arguments {'add_prefix_space': True} not recognized.
```
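This `ValueError` arises because the newer fast tokenizer's `_batch_encode_plus` rejects the `add_prefix_space` keyword that the older allennlp code still passes, which is why matching library versions (installing AllenNLP from source, as above) is the actual fix. The general failure mode, a caller passing keywords a newer API no longer accepts, can be illustrated with a generic sketch; the wrapper name `call_with_supported_kwargs` is my own invention, not anything allennlp or transformers provides:

```python
import inspect


def call_with_supported_kwargs(func, *args, **kwargs):
    """Call `func`, dropping any keyword arguments its signature does not accept.

    Hypothetical defensive wrapper for APIs whose signatures change between
    releases. If `func` takes **kwargs itself, everything is passed through.
    """
    sig = inspect.signature(func)
    accepts_var_kw = any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in sig.parameters.values()
    )
    if not accepts_var_kw:
        kwargs = {k: v for k, v in kwargs.items() if k in sig.parameters}
    return func(*args, **kwargs)
```

Such a wrapper silently discards stale keywords instead of raising, a trade-off that hides version drift, so pinning compatible versions remains the safer solution.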
Thanks!