cdpierse / transformers-interpret

Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
Apache License 2.0

AttributeError: 'str' object has no attribute 'config' #80

Closed: clechristophe closed this issue 2 years ago

clechristophe commented 2 years ago

Hi @cdpierse, your project looks very nice, congrats!

I have a problem with SequenceClassificationExplainer(): it doesn't work with any classification model I pass in (including the DistilBertForSequenceClassification you used in your Towards Data Science article). I get the following error:

AttributeError                            Traceback (most recent call last)
/tmp/ipykernel_21316/2774477167.py in <module>
      1 from transformers_interpret import SequenceClassificationExplainer
----> 2 cls_explainer = SequenceClassificationExplainer("I love you, I like you", model, tokenizer)

/dds/miniconda/envs/py39/lib/python3.9/site-packages/transformers_interpret/explainers/sequence_classification.py in __init__(self, model, tokenizer, attribution_type, custom_labels)
     55             AttributionTypeNotSupportedError:
     56         """
---> 57         super().__init__(model, tokenizer)
     58         if attribution_type not in SUPPORTED_ATTRIBUTION_TYPES:
     59             raise AttributionTypeNotSupportedError(

/dds/miniconda/envs/py39/lib/python3.9/site-packages/transformers_interpret/explainer.py in __init__(self, model, tokenizer)
     17         self.tokenizer = tokenizer
     18 
---> 19         if self.model.config.model_type == "gpt2":
     20             self.ref_token_id = self.tokenizer.eos_token_id
     21         else:

AttributeError: 'str' object has no attribute 'config'

I suspect it might be a version problem with the Hugging Face transformers library. I'm currently using transformers 4.16.2 and transformers-interpret 0.6.0.

Thanks for your help

cdpierse commented 2 years ago

Hi @clechristophe,

Thank you, and thanks for using the package.

I think the fix here might be an easy one. This PR https://github.com/cdpierse/transformers-interpret/pull/27 introduced a breaking change: the text to be explained can no longer be passed to the explainer's constructor. Instead, the explainer instance that is returned is callable, and you call it with the text to generate explanations. So instead of what you did, you would define it as:

cls_explainer = SequenceClassificationExplainer(model, tokenizer)
cls_explainer("I like you, I love you")

Let me know if this works.

Thanks, Charles

clechristophe commented 2 years ago

Hi @cdpierse

Thank you for your quick response!

It works perfectly. I hadn't seen the updated documentation in the README...

Clément