Closed: arnavc1712 closed this issue 3 years ago.
Hi Kyle, thanks for the reply. Yes, but that would require training the model again. Would it be possible to share the model weights after training on ActivityNet?
They were already uploaded to the same Google Drive link mentioned in the issue! Thank you.
Hi Kyle, thank you for the quick reply. I tried loading the weights into the model and also added the `tfilter` layer as described in the paper. However, the numbers are extremely low. Where could I be going wrong?

```
tIoU threshold:  0.50  0.55  0.60  0.65  0.70  0.75  0.80  0.85  0.90  0.95
val result at 0: 0.01 || 0.87 | 0.06, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.00
```
I think your `tfilter` layer is not defined properly. How did you add the layer? Please share the corresponding lines of your code.
Hi Kyle,
I used this in the `__init__()` function:

```python
self.tfilter = nn.Conv1d(num_class, num_class, kernel_size=13, stride=1, padding=12, dilation=2, bias=False, groups=num_class)
```
And this in the `forward()` function:

```python
tcam = F.relu(self.tfilter(tcam.permute([0, 2, 1]))).permute([0, 2, 1])
```
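As a side note, the padding in that `Conv1d` is exactly what keeps the temporal length unchanged: plugging `kernel_size=13`, `dilation=2`, `padding=12` into the standard 1D-convolution output-length formula gives `L_out = L`. A quick sketch to check this (the helper name is mine, not from the repo):

```python
def conv1d_out_len(L, kernel_size, stride=1, padding=0, dilation=1):
    # Standard output-length formula for a 1D convolution
    # (the same formula PyTorch documents for nn.Conv1d).
    return (L + 2 * padding - dilation * (kernel_size - 1) - 1) // stride + 1

# The tfilter configuration from the snippet above:
T = 750  # an arbitrary number of temporal snippets
assert conv1d_out_len(T, kernel_size=13, padding=12, dilation=2) == T
```

So the filter smooths the class scores over time without changing the number of snippets.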
Please try it without the ReLU activation; we apply `tfilter` without one.
Hi Kyle, Thanks very much for sharing the code. Could you please help me with where to add the filter?
Hi @memoryjing, You can add it at the end of the forward function.
```python
...
tcam = (cls_x_r + cls_x_rat*self.omega)*self.mul_r + (cls_x_f + cls_x_fat*self.omega)*self.mul_f
tcam = self.tfilter(tcam.permute(0, 2, 1)).permute(0, 2, 1)
return ...
```
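To make the shape convention concrete: `nn.Conv1d` expects input as (batch, channels, length), so `tcam` of shape (batch, T, num_class) is permuted to (batch, num_class, T) before filtering and permuted back afterwards. With `groups=num_class`, each class channel is smoothed by its own dilated kernel. A toy pure-Python version of that per-class dilated filtering (all names here are mine, not from the repo):

```python
def depthwise_dilated_conv1d(x, w, dilation=2, padding=12):
    """Per-channel (groups == channels) dilated 1D convolution.

    x: per-class score sequences, shape [C][T]
    w: one kernel per class, shape [C][K]
    Mirrors nn.Conv1d(C, C, K, dilation=d, padding=p, groups=C, bias=False).
    """
    K = len(w[0])
    span = dilation * (K - 1)  # temporal extent covered by one kernel
    out = []
    for xc, wc in zip(x, w):
        padded = [0.0] * padding + list(xc) + [0.0] * padding
        out.append([
            sum(wc[k] * padded[t + k * dilation] for k in range(K))
            for t in range(len(padded) - span)
        ])
    return out

# kernel_size=13, dilation=2, padding=12 keep the temporal length unchanged
scores = [[1.0] * 20]                      # 1 class, 20 snippets
kernels = [[1.0 / 13] * 13]                # a uniform smoothing kernel
smoothed = depthwise_dilated_conv1d(scores, kernels)
assert len(smoothed[0]) == len(scores[0])  # length preserved
```

This is just an illustration of what the layer computes; in practice the kernel weights come from the trained checkpoint.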
Hi, would it be possible to share the model weights from the ActivityNet training?