Closed: josejimenezluna closed this issue 4 years ago
Hi @josejimenezluna, thank you for finding the issue. It is a bug in the documentation. We decided to make baselines a mandatory argument for GradientShap and let the user specify the distribution of baselines through that argument. If we defaulted it to zero, it might not be a very meaningful distribution.
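For anyone landing here, below is a minimal sketch of passing an explicit baseline distribution to `GradientShap.attribute`. The toy model, tensor shapes, and `n_samples` value are made up purely for illustration:

```python
import torch
import torch.nn as nn
from captum.attr import GradientShap

# Hypothetical toy model, only for demonstration purposes
model = nn.Sequential(nn.Linear(3, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()

gradient_shap = GradientShap(model)

inputs = torch.randn(4, 3)

# baselines must be supplied explicitly; here a small random reference
# distribution is used (explicit zeros would also work if that is the
# behavior the docs described)
baselines = torch.randn(20, 3)

attributions = gradient_shap.attribute(
    inputs,
    baselines=baselines,  # required argument
    n_samples=50,
    target=0,             # attribute w.r.t. output index 0
)
print(attributions.shape)  # torch.Size([4, 3]), same shape as inputs
```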
Fixed with #256
https://github.com/pytorch/captum/blob/5231c2ec8c59f7ca48a7b421adb412ccf21c8364/captum/attr/_core/gradient_shap.py#L16-L35
According to the docs, the `baselines` parameter in the `attribute` method of `GradientShap` is optional and is replaced with a zero-filled tensor of the same size as the input if not provided. However, at the moment it is a required argument.