pytorch / captum

Model interpretability and understanding for PyTorch
https://captum.ai
BSD 3-Clause "New" or "Revised" License

Misleading "AssertionError: Given input is not a torch.Tensor" on layer attribution #745

Open R-N opened 3 years ago

R-N commented 3 years ago

🐛 Bug

Captum's layer attribution computes attributions with respect to the layer's output (the layer's output plays the same role in layer attribution that the actual input sample plays in input attribution). Captum supports layers with multiple outputs, but it expects every output to be a tensor.

A PyTorch module may return things other than tensors, whether intentionally or due to a bug, and Captum will raise an error since it expects tensors. However, the error message is misleading in this case: it raises "AssertionError: Given input is not a torch.Tensor". At first I couldn't tell that the problem was with the layer's output; I thought it was the input, because that is what the error message implies.

To Reproduce

Steps to reproduce the behavior:

  1. Create a model with several layers
  2. Have one of the layers return multiple outputs with one of them being None
  3. Do layer attribution with Captum for that layer (a minimal sketch follows below)
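
A minimal sketch of these steps, assuming `LayerActivation` as the attribution method; the model, the layer returning `(tensor, None)`, and all names here are hypothetical stand-ins, not the original code:

```python
import torch
import torch.nn as nn
from captum.attr import LayerActivation

class BadLayer(nn.Module):
    # Returns a tuple whose second element is None, e.g. an optional
    # auxiliary output that happens to be unset.
    def forward(self, x):
        return x * 2, None

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.lin = nn.Linear(4, 4)
        self.bad = BadLayer()
        self.head = nn.Linear(4, 1)

    def forward(self, x):
        out, _ = self.bad(self.lin(x))
        return self.head(out)

model = Model()
layer_act = LayerActivation(model, model.bad)
# The input below is a valid tensor, yet Captum raises
# "AssertionError: Given input is not a torch.Tensor": the offending
# value is actually the None inside the layer's output tuple.
attr = layer_act.attribute(torch.randn(2, 4))
```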

Expected behavior

I'm not sure whether "expected behavior" here means the current behavior as expected from the reproduction steps, or the behavior I would like to see, so here are both:

Current behavior: Captum will raise "AssertionError: Given input is not a torch.Tensor" on that layer call.

What I would like to happen: Captum raises an error telling me that the layer's output is not a torch.Tensor, instead of referring to the "Given input".
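
As a rough illustration, a check along the following lines could produce a clearer message. The helper name, its placement, and the exact wording are assumptions for illustration, not Captum's actual internals:

```python
import torch

def _check_layer_output(layer_output, layer_name="target layer"):
    # Hypothetical helper: validate every element of a (possibly tuple)
    # layer output and blame the layer output explicitly in the message.
    outputs = layer_output if isinstance(layer_output, tuple) else (layer_output,)
    for i, out in enumerate(outputs):
        assert isinstance(out, torch.Tensor), (
            f"Output #{i} of {layer_name} is not a torch.Tensor "
            f"(got {type(out).__name__}). Layer attribution requires "
            f"all outputs of the target layer to be tensors."
        )
```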

Environment

Describe the environment used for Captum


 - Captum / PyTorch Version (e.g., 1.0 / 0.4.0): 0.4.0
 - OS (e.g., Linux): Linux
 - How you installed Captum / PyTorch (`conda`, `pip`, source): pip
 - Build command you used (if compiling from source):
 - Python version: 3.7.11
 - CUDA/cuDNN version: unknown
 - GPU models and configuration: unknown
 - Any other relevant information: It's a Google Colab environment
bilalsal commented 3 years ago

This is a good proposal!

Thank you very much for bringing this up, @R-N! I will follow up on this.

Bilal