This repository contains code for the paper:
Making Sense of Dependence: Efficient Black-box Explanations Using Dependence Measure, Paul Novello, Thomas Fel, David Vigouroux, NeurIPS 2022.
The code is implemented for TensorFlow, and a TensorFlow notebook is available: notebook Tensorflow.
Update 14/12/2022: The method is now available in Xplique, an awesome XAI library that was used for all the experiments in the paper.
The present code uses version 0.4.2 of Xplique. The namespaces of version 0.4.3 were changed to include the HSIC attribution method, so we recommend using HSIC directly from Xplique 0.4.3. Since HSIC is now part of Xplique, this repository will no longer be maintained.
Nonetheless, this repository may still be useful for assessing interactions, since this feature is not yet implemented in Xplique.
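For readers unfamiliar with the underlying dependence measure, here is a minimal NumPy sketch of the standard biased empirical HSIC estimator with Gaussian kernels, HSIC = (1/n²) tr(K H L H). This is an illustrative implementation of the general estimator, not the repository's optimized code; the kernel bandwidth `sigma=1.0` is an arbitrary choice for the example.

```python
import numpy as np

def rbf_kernel(x, sigma=1.0):
    # Gaussian RBF kernel matrix from pairwise squared distances.
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC estimator: (1/n^2) * tr(K H L H)."""
    n = x.shape[0]
    K = rbf_kernel(x, sigma)
    L = rbf_kernel(y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / n ** 2
```

A quick sanity check: HSIC between a variable and a noisy copy of itself is markedly larger than between two independent samples, which is what makes it usable as an attribution score.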
The notebook allows you to run the code for:
The images are taken from ImageNet and the model used is a ResNet50.
Visualizations of object detection explanations with YOLOv4 for the first 40 images of the COCO dataset can be found in a zip file at ./assets/coco_viz_select.zip.
The code for the metrics and the other attribution methods used in the paper comes from the Xplique toolbox.
Paul Novello, DEEL team, Artificial and Natural Intelligence Toulouse Institute; IRT Saint-Exupéry
Thomas Fel, DEEL team, Artificial and Natural Intelligence Toulouse Institute; Carney Institute for Brain Science, Brown University
David Vigouroux, DEEL team, Artificial and Natural Intelligence Toulouse Institute; IRT Saint-Exupéry
@inproceedings{novello_hsic_attribution,
author = {Novello, Paul and Fel, Thomas and Vigouroux, David},
booktitle = {Advances in Neural Information Processing Systems},
title = {Making Sense of Dependence: Efficient Black-box Explanations Using Dependence Measure},
volume = {35},
year = {2022}
}