Closed: ik362 closed this issue 2 years ago
Cross-posted the same question on the PyTorch forums: https://discuss.pytorch.org/t/implementing-captum-with-pytorch-lightning/129292
Hi @ik362, sorry for my late reply.
I believe pytorch-lightning is not the issue here. Captum will work as long as your model exposes a forward-like interface to pass in.
The issue is caused by the 2nd argument in the following line.
self.cam = LayerGradCam(self.forward, 'model.5')
What is the string 'model.5'? Is it the name of a named module? Is it defined in one of your blocks, e.g., linearBlock?
In any case, the 2nd argument, layer, should be the module object itself, not a name string. You can refer to our documentation for details: https://github.com/pytorch/captum/blob/4faf1ea49fbff90af92b759c1f763dda1d8be705/captum/attr/_core/layer/grad_cam.py#L64-L67
❓ Questions and Help
Hi there, I am new to Captum and I am trying to use LayerGradCam to interpret a particular layer in my model.
Part of the complication seems to be that my model and forward method are defined in a pytorch-lightning module.
My pytorch-lightning module is:
However, when I run the test step I am getting the error:
I have two questions then:
Thanks for your help!