Closed — nfelnlp closed this issue 3 years ago
Came up with a way to at least make the attribute function work for GuidedBackprop (and would work for Input x Gradient as well): https://github.com/nfelnlp/thermostat/blob/a8d74650f8ba4f4b0f6504cf8c06043e492d6ed3/src/thermostat/explainers/grad.py#L115
However, this meant passing the `inputs_embeds` parameter instead of the `input_ids` parameter. This results in an attributions shape of (1, 512, 768), which probably doesn't make sense at all.
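For context, (1, 512, 768) is (batch, sequence length, embedding dimension): attributing against `inputs_embeds` yields one score per embedding component rather than per token. A common workaround (a sketch, not the repo's actual code; the array here is random stand-in data) is to collapse the last axis to get one score per token:

```python
import numpy as np

# Hypothetical embedding-level attributions, e.g. what GuidedBackprop returns
# when run on inputs_embeds: shape (batch, seq_len, hidden_dim).
batch, seq_len, hidden = 1, 512, 768
rng = np.random.default_rng(0)
attributions = rng.normal(size=(batch, seq_len, hidden))

# Collapse the hidden dimension to obtain one attribution per token,
# which is the granularity a token-level saliency map needs.
token_scores = attributions.sum(axis=-1)

print(token_scores.shape)  # (1, 512)
```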
After consulting with @rbtsbg, InputXGradient and GuidedBackprop will be substituted by LayerGradientXActivation, which works similarly to LayerIntegratedGradients and is easier to handle: you just pass the base model's embedding layer to the explainer at creation time.
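Conceptually, LayerGradientXActivation multiplies a layer's activations by the gradient of the model output with respect to those activations. A toy NumPy sketch of that idea (hypothetical shapes and a linear read-out head, so the gradient has a closed form; this is not Captum's implementation):

```python
import numpy as np

# Toy gradient-x-activation: attribution = activation * d(score)/d(activation).
rng = np.random.default_rng(1)
seq_len, hidden = 4, 3
activations = rng.normal(size=(seq_len, hidden))  # stand-in embedding-layer output
w = rng.normal(size=(hidden,))                    # stand-in linear head weights

# Model score: linear read-out summed over tokens.
score = (activations @ w).sum()

# For this linear head, d(score)/d(activations) is w, broadcast over tokens.
grads = np.broadcast_to(w, (seq_len, hidden))

# Gradient x activation, then sum over the hidden dim for per-token scores.
attributions = (grads * activations).sum(axis=-1)

# In this linear case the per-token attributions add up to the score exactly.
assert np.isclose(attributions.sum(), score)
```

In Captum itself the analogue would be constructing `LayerGradientXActivation(forward_func, layer)` with the base model's embedding layer as `layer`, which already returns layer-shaped attributions without touching `inputs_embeds`.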
Stuck at this error while implementing InputXGradient. Tested on DistilBERT and RoBERTa.