Closed kidlestar closed 4 years ago
Hi, thank you for pointing this out. This behavior is caused by the following line https://github.com/yzhangcs/crfpar/blob/8abb95f177e5cbf4d7ebc494bcaf9ca15af3e3da/parser/utils/alg.py#L165 which is used to disable multi-root trees. If you delete that line, the result is the same as Torch-Struct:
```python
>>> s = torch.zeros(1, 3, 3).requires_grad_()
>>> mask = torch.tensor([[0, 1, 1]]).bool()
>>> lens = torch.tensor([[2]])
>>> _, s_c = inside(s, mask)
>>> s_c[0].gather(0, lens).sum()
tensor(1.0986, grad_fn=<SumBackward0>)
```
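For intuition about where the two values come from, here is a minimal sketch (my own illustration, not code from the repo) that enumerates the dependency trees of a 2-word sentence by brute force. With all arc scores zero, logZ is simply the log of the number of trees, so counting all trees versus single-root trees reproduces ln(3) and ln(2). The helper `is_tree` and the variable names are mine; for n = 2 every valid tree is projective, so no projectivity check is needed.

```python
from itertools import product
from math import log

def is_tree(heads):
    # heads[i] is the head of word i+1 (0 is the artificial ROOT);
    # valid iff every word reaches ROOT without hitting a cycle
    n = len(heads)
    for i in range(1, n + 1):
        seen, node = set(), i
        while node != 0:
            if node in seen:  # cycle -> not a tree
                return False
            seen.add(node)
            node = heads[node - 1]
    return True

n = 2
# candidate head assignments, excluding self-loops
trees = [h for h in product(range(n + 1), repeat=n)
         if all(h[i] != i + 1 for i in range(n)) and is_tree(h)]
single_root = [h for h in trees if sum(1 for x in h if x == 0) == 1]

print(len(trees), len(single_root))                 # 3 2
print(log(len(trees)), log(len(single_root)))       # 1.0986... 0.6931...
```

So the line above restricts the inside algorithm to the 2 single-root trees (logZ = ln 2), while Torch-Struct sums over all 3 trees (logZ = ln 3).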
Thanks for the response by mail.
It really is the multi-root handling that makes the difference. (I ran another test comparing the partition values; they are exactly the same when that line is commented out.)
Many thanks for your help!
Testing the result of the inside function, where inside_ng is the inside function with all the register_hook calls removed:
```python
import torch
from torch_struct import DependencyCRF
from alg import inside_ng

score = torch.zeros(1, 3, 3)
score1 = torch.zeros(1, 2, 2)  # equivalent score used by torch_struct
mask = torch.tensor([False, True, True]).unsqueeze(0)
lens = torch.tensor([2]).long()

# partition by inside
s_i, s_c = inside_ng(score, mask)
partition = s_c[0].gather(0, lens.unsqueeze(0)).sum()
print(partition)

# partition by torch_struct
deptree = DependencyCRF(score1, lens)
print(deptree.partition)
```
The results are 0.6931 (by inside) and 1.0986 (by torch_struct).
By hand calculation, the true logZ should be ln(3) = 1.0986, not ln(2) = 0.6931.
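To double-check the hand calculation: since every tree has score zero, the partition is a logsumexp over equal scores, i.e. logZ = log(number of trees). A minimal sketch (the helper name `log_partition` is my own):

```python
from math import log, exp

def log_partition(tree_scores):
    # numerically stable logsumexp over an explicit list of tree scores
    m = max(tree_scores)
    return m + log(sum(exp(s - m) for s in tree_scores))

# 3 trees when multiple roots are allowed; 2 when only one root is allowed
print(round(log_partition([0.0, 0.0, 0.0]), 4))  # 1.0986
print(round(log_partition([0.0, 0.0]), 4))       # 0.6931
```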