pytorch / captum

Model interpretability and understanding for PyTorch
https://captum.ai
BSD 3-Clause "New" or "Revised" License

TracIn: sample_wise_grads_per_batch: add_hooks(model) is not clearly documented #987

Open felixmeyjr opened 2 years ago

felixmeyjr commented 2 years ago

❓ Questions and Help

The constructor of TracInCP has the flag `sample_wise_grads_per_batch`. The docstring of `_compute_jacobian_wrt_params_with_sample_wise_trick`, the "main" method implementing this trick, says that the user must call `add_hooks(model)` before calling it. The documentation is unclear because this note appears only in that method's docstring. I assume that I need to call `add_hooks(model)` from autograd-hacks before constructing TracInCP? What happens if I don't add the hooks? Thanks

99warriors commented 2 years ago

Hi @felixmeyjr, thank you for pointing this out. You do not need to call `add_hooks`; that is effectively done for you inside `_compute_jacobian_wrt_params_with_sample_wise_trick`. We apologize for the incorrect documentation and will correct it. In general, you do not need to worry about hooks when using TracIn.
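For readers curious what those internally-managed hooks actually do: the "sample-wise trick" uses forward/backward hooks to recover per-sample gradients from a single backward pass over the whole batch, instead of one backward pass per example. Below is a minimal plain-PyTorch sketch of the idea for a single `nn.Linear` layer (this is an illustration of the technique, not Captum's actual implementation; the variable names and the toy model are my own):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 3)
x = torch.randn(8, 4)
y = torch.randn(8, 3)
# reduction="sum" keeps the batch loss separable into per-sample losses,
# which is what makes the trick valid.
loss_fn = nn.MSELoss(reduction="sum")

# Hooks capture the layer input and the gradient w.r.t. the layer output,
# mimicking (in spirit) what autograd-hacks' add_hooks sets up.
saved = {}
model.register_forward_hook(
    lambda m, inp, out: saved.__setitem__("inp", inp[0].detach())
)
model.register_full_backward_hook(
    lambda m, gin, gout: saved.__setitem__("gout", gout[0].detach())
)

loss_fn(model(x), y).backward()

# One backward pass, yet per-sample weight gradients fall out as an
# outer product per example: dL_i/dW = grad_out_i (outer) input_i.
per_sample_w = torch.einsum("bo,bi->boi", saved["gout"], saved["inp"])

# Sanity check against the naive approach: one backward pass per example.
for i in range(x.shape[0]):
    model.zero_grad()
    loss_fn(model(x[i : i + 1]), y[i : i + 1]).backward()
    assert torch.allclose(per_sample_w[i], model.weight.grad, atol=1e-5)
```

Because TracIn's influence scores are dot products of per-sample gradients, computing them batch-wise like this is what `sample_wise_grads_per_batch=True` enables, and the hook bookkeeping is handled internally as noted above.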