-
### 🛠 Proposed Refactor
The set of algorithms supported by Captum could easily be extended in the explain module.
### Suggest a potential alternative/fix
In `captum_explainer.py`, `supported_methods=[]` can be a…
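One way the suggestion could look is a method whitelist that callers can extend instead of a hard-coded list inside the module. This is only a hypothetical sketch: `CaptumLikeExplainer` and `SUPPORTED_METHODS` are illustrative names, not the actual `captum_explainer.py` API.

```python
# Hypothetical sketch of a configurable supported-methods whitelist.
# Names here are illustrative, not the real captum_explainer.py API.
SUPPORTED_METHODS = ["IntegratedGradients", "Saliency", "InputXGradient"]

class CaptumLikeExplainer:
    def __init__(self, attribution_method, supported_methods=None):
        # Callers may pass an extended whitelist instead of editing the module.
        self.supported_methods = list(supported_methods or SUPPORTED_METHODS)
        if attribution_method not in self.supported_methods:
            raise ValueError(
                f"Unsupported method {attribution_method!r}; "
                f"expected one of {self.supported_methods}"
            )
        self.attribution_method = attribution_method

# Extending support at the call site, without touching the library:
explainer = CaptumLikeExplainer(
    "DeepLift", supported_methods=SUPPORTED_METHODS + ["DeepLift"]
)
```

With this shape, adding a new attribution method is a one-line change for the caller rather than a library patch.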
-
## 🐛 Bug
I have been trying to import and use Captum in different environments and on different machines, and I cannot import it (the installation itself completes without problems).
## To Reproduce
For example…
-
-
## 🚀 Feature
Integration of explainability methods with the DGL data format required for mini-batch training.
## Motivation
Right now it is not possible to use captum for DGL in heterogeneous …
-
## 🐛 Bug
## To Reproduce
Steps to reproduce the behavior:
1. Install Captum using the latest conda, along with the latest PyTorch and Python 3.11, on Linux or Windows.
2. Importing `captu…
-
### 🐛 Describe the bug
When trying to use the DeepLift explainability method from Captum, I get an AssertionError related to the dimensionality of the input mask.
Code to reproduce the error is …
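The reproduction code is truncated above, so the exact cause is unclear; a common source of such assertion errors is a mask whose shape does not line up with the input's trailing dimensions. A minimal, purely illustrative shape check (the helper below is hypothetical, not a Captum API):

```python
def check_mask(input_shape, mask_shape):
    """Hypothetical helper: verify that a feature mask matches the
    input's trailing dimensions, the usual requirement behind this
    kind of dimensionality assertion (size-1 dims may broadcast)."""
    if len(mask_shape) > len(input_shape):
        return False
    return all(
        m == i or m == 1
        for m, i in zip(mask_shape, input_shape[-len(mask_shape):])
    )

# A (3, 28, 28) mask fits a (batch, 3, 28, 28) input; a (4, 28, 28) one does not.
assert check_mask((32, 3, 28, 28), (3, 28, 28))
assert not check_mask((32, 3, 28, 28), (4, 28, 28))
```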
-
## 🚀 Feature
Currently, nn.Transformer and related modules return only outputs. I suggest returning attention weights as well.
## Motivation
For all purposes -- demos, tutorials, and practica…
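The weights the request asks for are already exposed one level down on `nn.MultiheadAttention`, which `nn.Transformer` wraps; a minimal sketch of what is available today (shapes assume `batch_first=True` and the default head-averaging):

```python
import torch
import torch.nn as nn

# nn.MultiheadAttention already returns attention weights when
# need_weights=True; the feature request is for nn.Transformer,
# which wraps it, to surface them as well.
attn = nn.MultiheadAttention(embed_dim=16, num_heads=4, batch_first=True)
x = torch.randn(2, 5, 16)  # (batch, seq, embed)

out, weights = attn(x, x, x, need_weights=True)
# out:     (2, 5, 16) — attention output
# weights: (2, 5, 5)  — attention weights, averaged over heads by default
```

Passing `average_attn_weights=False` instead yields per-head weights of shape `(batch, num_heads, seq, seq)`.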
-
Hi, I ran the Captum Insights example (`python -m captum.insights.example`) and also tried out CIFAR_TorchVision_Captum_Insights.ipynb in a Jupyter notebook. I am able to render the visualizer, and it …
-
## 🚀 Feature Request
The following is a non-exhaustive list of gradient-based feature attribution methods that could be added to the library:
| Method name | Source | In Captum | Code |
…
-
Using https://github.com/pytorch/captum, we need to find some good benchmarks that show the reason for a verdict/diagnosis.
Our goal is to provide a verifiable second opinion on a diagnosis.