suinleelab / path_explain

A repository for explaining feature attributions and feature interactions in deep neural networks.
MIT License

output_indices not passed through for torch interactions #3

Closed jumelet closed 3 years ago

jumelet commented 4 years ago

Hi! I'm trying to run your interaction setup on a torch model (from Hugging Face's library), but I run into trouble because output_indices doesn't seem to be passed through to the attributions method.

https://github.com/suinleelab/path_explain/blob/673ee950dd770925b51b299c7276fc2c2ed8febd/path_explain/explainers/path_explainer_torch.py#L245-L246

This causes an error in _get_grad, which tries to unsqueeze output_indices even though it is still None because it was never passed through: https://github.com/suinleelab/path_explain/blob/673ee950dd770925b51b299c7276fc2c2ed8febd/path_explain/explainers/path_explainer_torch.py#L129-L131
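
To make the failure concrete, here's a minimal self-contained repro of the same pattern (the helper functions below are a toy stand-in for the library's call chain, not its actual code):

```python
import torch

def _get_grad(output, output_indices):
    # Mirrors the failing line: unsqueeze is called unconditionally.
    return output.gather(1, output_indices.unsqueeze(1))

def attributions(output, output_indices=None):
    return _get_grad(output, output_indices)

def interactions_buggy(output, output_indices=None):
    return attributions(output)  # output_indices is dropped here

def interactions_fixed(output, output_indices=None):
    return attributions(output, output_indices=output_indices)

logits = torch.randn(4, 3)
idx = torch.tensor([0, 2, 1, 0])

print(interactions_fixed(logits, idx).shape)  # torch.Size([4, 1])
interactions_buggy(logits, idx)  # AttributeError: 'NoneType' object has no attribute 'unsqueeze'
```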

jjanizek commented 4 years ago

Great catch, I'll fix that ASAP! The torch version was exclusively run on scalar-output models for the experiments in our paper, which must be how I missed this. Thanks!

Jiang15 commented 3 years ago

Hi, do you know how to solve this issue? I ran into the same error.
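
One interim workaround, assuming the explainer accepts any torch.nn.Module: wrap the model so it already returns a scalar per example, matching the scalar-output setting the maintainer describes above. The class and usage names below are illustrative, not from the repo's docs:

```python
import torch

class ScalarOutputWrapper(torch.nn.Module):
    """Selects one logit inside the model so output_indices is never needed."""
    def __init__(self, model, target_index):
        super().__init__()
        self.model = model
        self.target_index = target_index

    def forward(self, x):
        # For a Hugging Face model this may need `self.model(x).logits` instead.
        return self.model(x)[:, self.target_index]

# Illustrative usage (explainer name assumed from path_explainer_torch.py):
# wrapped = ScalarOutputWrapper(hf_model, target_index=1)
# explainer = PathExplainerTorch(wrapped)
# interactions = explainer.interactions(inputs, baseline, num_samples=128)
```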