cisnlp / simalign

Obtain Word Alignments using Pretrained Language Models (e.g., mBERT)
MIT License

Error in running align_example.py #21

Closed sriram-c closed 3 years ago

sriram-c commented 3 years ago

Hi,

I am getting the following error while executing align_example.py:

/home/sriram/anaconda3/envs/simalign/lib/python3.7/site-packages/torch/cuda/__init__.py:52: UserWarning: CUDA initialization: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx (Triggered internally at  /pytorch/c10/cuda/CUDAFunctions.cpp:100.)
  return torch._C._cuda_getDeviceCount() > 0
2020-12-29 12:04:07,522 - simalign.simalign - INFO - Initialized the EmbeddingLoader with model: bert-base-multilingual-cased
Traceback (most recent call last):
  File "align_example.py", line 7, in <module>
    result = model.get_word_aligns(source_sentence, target_sentence)
  File "/home/sriram/anaconda3/envs/simalign/lib/python3.7/site-packages/simalign/simalign.py", line 211, in get_word_aligns
    vectors = self.embed_loader.get_embed_list([src_sent, trg_sent]).cpu().detach().numpy()
  File "/home/sriram/anaconda3/envs/simalign/lib/python3.7/site-packages/simalign/simalign.py", line 66, in get_embed_list
    inputs = self.tokenizer(sent_batch, is_pretokenized=True, padding=True, truncation=True, return_tensors="pt")
  File "/home/sriram/anaconda3/envs/simalign/lib/python3.7/site-packages/transformers/tokenization_utils_base.py", line 2371, in __call__
    **kwargs,
  File "/home/sriram/anaconda3/envs/simalign/lib/python3.7/site-packages/transformers/tokenization_utils_base.py", line 2556, in batch_encode_plus
    **kwargs,
  File "/home/sriram/anaconda3/envs/simalign/lib/python3.7/site-packages/transformers/tokenization_utils.py", line 526, in _batch_encode_plus
    ids, pair_ids = ids_or_pair_ids
ValueError: too many values to unpack (expected 2)
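
For reference, here is a minimal sketch of the script that reaches this tokenizer call, adapted from the repository's README example (the actual align_example.py may differ slightly):

```python
# Minimal sketch of the failing run, adapted from the simalign README example.
from simalign import SentenceAligner

# simalign expects pretokenized sentences, i.e. lists of words.
source_sentence = ["This", "is", "a", "test", "."]
target_sentence = ["Das", "ist", "ein", "Test", "."]

model = SentenceAligner(model="bert", token_type="bpe", matching_methods="mai")
result = model.get_word_aligns(source_sentence, target_sentence)
print(result)
```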

Can you please help to resolve this issue?

Thanks, Sriram

sriram-c commented 3 years ago

I was able to resolve it by downgrading transformers from 4.1.1 to 3.1.0.
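
A quick sanity check that the downgrade is active in the environment (assuming transformers was pinned with, e.g., pip install transformers==3.1.0):

```python
# Confirm the interpreter picks up the downgraded package.
import transformers

assert transformers.__version__ == "3.1.0", transformers.__version__
print("transformers", transformers.__version__)
```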

Thanks, Sriram

masoudjs commented 3 years ago

Hello Sriram,

Thank you for using our tool. We are still using transformers 3.1.0; we will update SimAlign to support transformers > 4.1 soon.

mzeidhassan commented 3 years ago

Hi @masoudjs, sorry to reopen this. I am still getting this error with transformers 4.2.2. Is 4.x supported already, or do I need to downgrade to 3.1.0?

Thanks

pdufter commented 3 years ago

Hi @mzeidhassan, no worries. I think it should be fixed now - I tested it with transformers 4.3.0, but it should work with any recent version. Let me know if you have any issues.
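
For anyone curious about the underlying cause: the traceback above shows simalign calling the tokenizer with is_pretokenized=True, a keyword that transformers renamed to is_split_into_words around the 3.x/4.x transition. Without that flag, each pretokenized sentence is treated as a (text, text_pair) pair, which matches the "too many values to unpack (expected 2)" error. An illustrative call (not simalign's actual code) that works on recent transformers versions:

```python
# Illustration of tokenizing pretokenized input on transformers 4.x.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
sent_batch = [["This", "is", "a", "test", "."],
              ["Das", "ist", "ein", "Test", "."]]

inputs = tokenizer(
    sent_batch,
    is_split_into_words=True,  # 4.x name for the old is_pretokenized flag
    padding=True,
    truncation=True,
    return_tensors="pt",
)
print(inputs["input_ids"].shape)
```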

michelleqyhqyh commented 1 year ago

I solved this issue by upgrading transformers to 4.3.0.