Open sangho-vision opened 5 years ago
Hello, in your implementation of SpGAT there is this line:

edge_e = torch.exp(-self.leakyrelu(self.a.mm(edge_h).squeeze()))

However, I cannot understand why you added the minus sign in front of the leaky ReLU operation. Is that right? I think it is wrong and there is no need for it. What do you think? @sh0416
Yes, it is wrong. I wanted to avoid numerical instability. Instead of the minus sign, you need to subtract the max value of the tensor, as in the logsumexp trick.
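For concreteness, here is a minimal sketch of the max-subtraction trick, assuming scores holds the raw per-edge attention logits (the output of self.leakyrelu(self.a.mm(edge_h).squeeze()); the sample values are made up):

```python
import torch

# Hypothetical raw attention logits; the large values stand in for what
# self.leakyrelu(self.a.mm(edge_h).squeeze()) can produce in practice.
scores = torch.tensor([10.0, 50.0, 100.0])

# Naive exponentiation overflows float32 once a logit exceeds ~88.7.
naive = torch.exp(scores)                 # last entry becomes inf

# Logsumexp-style trick: shift by the max before exponentiating, so
# every exponent is <= 0 and exp() stays in (0, 1].
edge_e = torch.exp(scores - scores.max())

# The shift cancels in any softmax-style normalization, because both
# the numerator and the denominator pick up the same exp(-max) factor.
```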
Hey @sh0416, sorry, but what do you mean by "subtract the max value of the tensor, as in the logsumexp trick"? Thanks!
Hi @sh0416, could you share a modified version of the code? I read the code and concluded that just removing the minus sign would work.
Hi @sh0416, I am new to deep learning. I find that there will be inf values in torch.exp(self.leakyrelu(self.a.mm(edge_h).squeeze())) (without the minus sign). Do you think the minus sign is necessary for numerical stability, or should we switch to another function?
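As a quick check of the overflow described above, assuming the default float32 tensors in PyTorch:

```python
import torch

# float32 overflows to inf in exp() once the argument passes ~88.72,
# since the largest finite float32 is about 3.4e38.
print(torch.exp(torch.tensor(88.0)))   # tensor(1.6516e+38), still finite
print(torch.exp(torch.tensor(89.0)))   # tensor(inf), overflow

# So dropping the minus sign without also shifting by the max (as
# suggested above) can indeed produce inf for large positive logits.
```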