cdpierse / transformers-interpret

Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
Apache License 2.0

TokenClassificationExplainer Example does not work (LIGAttributions receives unexpected keyword argument "target") #97

Closed pharnisch closed 2 years ago

pharnisch commented 2 years ago

Hello, I want to use the TokenClassificationExplainer. But when I run the example from your README.md, I receive the following error:

Traceback (most recent call last):
  File "trans_int.py", line 14, in <module>
    word_attributions = ner_explainer(sample_text, ignored_labels=['O'])
  File "/vol/fob-vol7/mi19/harnisph/ti/transformers_interpret/explainers/token_classification.py", line 296, in __call__
    return self._run(text, embedding_type=embedding_type)
  File "/vol/fob-vol7/mi19/harnisph/ti/transformers_interpret/explainers/token_classification.py", line 264, in _run
    self._calculate_attributions(embeddings=embeddings)
  File "/vol/fob-vol7/mi19/harnisph/ti/transformers_interpret/explainers/token_classification.py", line 234, in _calculate_attributions
    n_steps=self.n_steps,
TypeError: __init__() got an unexpected keyword argument 'target'

After debugging, I saw that the constructor of LIGAttributions receives target=7, and the class itself declares the argument target: Optional[Union[int, Tuple, torch.Tensor, List]] = None, so I don't understand why it rejects this argument: it is obviously declared, and an int should be a valid value type.
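For what it's worth, this kind of mismatch usually means the class definition actually being executed (the installed package) is older than the source being read. A toy sketch of the same failure, using a hypothetical stand-in class (LIGAttributionsOld is invented for illustration, not the real class):

```python
# Toy reproduction of the failure mode: the *installed* (older) version of a
# class may not accept a keyword that the source you are reading already
# declares. LIGAttributionsOld is a hypothetical stand-in.

class LIGAttributionsOld:
    def __init__(self, custom_forward, embeddings):  # no `target` parameter yet
        self.custom_forward = custom_forward
        self.embeddings = embeddings

try:
    # The caller (newer source) passes target=7, which the old class rejects.
    LIGAttributionsOld(custom_forward=None, embeddings=None, target=7)
except TypeError as e:
    message = str(e)

print(message)  # mentions: unexpected keyword argument 'target'
```

So even though the keyword looks present in the file on disk, the import may resolve to an older installed copy that predates it.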

Maybe someone can help me to understand/fix this. Thank you in advance.

cdpierse commented 2 years ago

Hi @pharnisch, thanks for flagging this with me. Can I ask what version you are using? There were some bugs in versions 0.7.0-0.7.4 with the new explainer, which I have since fixed. I just ran the README example for token classification in a fresh environment and it seems to work for me.

pharnisch commented 2 years ago

Thank you for your fast response. Indeed, the problem could be due to versioning, since I could only install version 0.6.0. But when I try to install the newest version, it fails:

pip install transformers-interpret==0.7.5
Collecting transformers-interpret==0.7.5
  Using cached transformers_interpret-0.7.5-py3-none-any.whl (38 kB)
Requirement already satisfied: transformers>=3.0.0 in ./environments/rare-facts/lib/python3.6/site-packages (from transformers-interpret==0.7.5) (4.17.0)
Collecting pytest<6.0.0,>=5.4.2
  Using cached pytest-5.4.3-py3-none-any.whl (248 kB)
ERROR: Could not find a version that satisfies the requirement ipython<8.0.0,>=7.31.1 (from transformers-interpret) (from versions: 0.10, 0.10.1, 0.10.2, 0.11, 0.12, 0.12.1, 0.13, 0.13.1, 0.13.2, 1.0.0, 1.1.0, 1.2.0, 1.2.1, 2.0.0, 2.1.0, 2.2.0, 2.3.0, 2.3.1, 2.4.0, 2.4.1, 3.0.0, 3.1.0, 3.2.0, 3.2.1, 3.2.2, 3.2.3, 4.0.0b1, 4.0.0, 4.0.1, 4.0.2, 4.0.3, 4.1.0rc1, 4.1.0rc2, 4.1.0, 4.1.1, 4.1.2, 4.2.0, 4.2.1, 5.0.0b1, 5.0.0b2, 5.0.0b3, 5.0.0b4, 5.0.0rc1, 5.0.0, 5.1.0, 5.2.0, 5.2.1, 5.2.2, 5.3.0, 5.4.0, 5.4.1, 5.5.0, 5.6.0, 5.7.0, 5.8.0, 5.9.0, 5.10.0, 6.0.0rc1, 6.0.0, 6.1.0, 6.2.0, 6.2.1, 6.3.0, 6.3.1, 6.4.0, 6.5.0, 7.0.0b1, 7.0.0rc1, 7.0.0, 7.0.1, 7.1.0, 7.1.1, 7.2.0, 7.3.0, 7.4.0, 7.5.0, 7.6.0, 7.6.1, 7.7.0, 7.8.0, 7.9.0, 7.10.0, 7.10.1, 7.10.2, 7.11.0, 7.11.1, 7.12.0, 7.13.0, 7.14.0, 7.15.0, 7.16.0, 7.16.1, 7.16.2, 7.16.3)
ERROR: No matching distribution found for ipython<8.0.0,>=7.31.1

I have installed:

- Python 3.6.15 (sadly, I cannot change this)
- PyTorch 1.10.2+cu113
- transformers 4.17.0
- Captum 0.3.1

If I understand correctly, my Python version is too low to install a recent enough ipython package. But then the required Python version stated in the README.md is not right, since I fulfill all four of these criteria.
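A rough sketch of why pip rejects every candidate here. The version tuples come from the tail of pip's list in the error output; that the later ipython releases are invisible on Python 3.6 because they declare a higher Requires-Python is my assumption:

```python
# Model of pip's candidate filtering: on Python 3.6 the newest ipython release
# pip can see is 7.16.3, but transformers-interpret pins ipython<8.0.0,>=7.31.1,
# so no visible candidate satisfies the pin.

def satisfies(version, lower=(7, 31, 1), upper=(8, 0, 0)):
    """Check a version tuple against the ipython pin from the error above."""
    return lower <= version < upper

visible_on_py36 = [(7, 16, 1), (7, 16, 2), (7, 16, 3)]  # tail of pip's list
ok = any(satisfies(v) for v in visible_on_py36)
print(ok)  # False -> "No matching distribution found"
```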

Anyway, I just created a fork, removed the ipython dependency, and now it works for my case. Thank you! :)