I used the debugger to find out where things go wrong: before `e1` and `rel` are embedded, they are both int64 tensors of shape `torch.Size([128, 1])`.
`e1` is embedded normally, becoming `torch.float32` with shape `torch.Size([128, 1, 10, 20])`. However, after `rel` is passed through the embedding layer `emb_rel`, the debugger shows every tensor as `Unable to get repr for <class 'torch.Tensor'>`.
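For reference, here is a minimal, hypothetical sketch (the sizes and variable names are my assumptions, not the repo's actual model) of why this symptom appears: `torch.nn.Embedding` only accepts indices in `[0, num_embeddings - 1]`, and on CUDA an out-of-range index trips a device-side assert, after which tensors on that device can no longer be printed.

```python
import torch
import torch.nn as nn

num_relations = 237                    # assumed look-up table size
emb_rel = nn.Embedding(num_relations, 200)

rel_ok = torch.randint(0, num_relations, (128, 1))                    # valid ids
rel_bad = torch.full((128, 1), num_relations + 5, dtype=torch.long)   # ids past the table

print(emb_rel(rel_ok).shape)   # torch.Size([128, 1, 200]) -- embeds fine

try:
    emb_rel(rel_bad)           # on CPU this raises IndexError immediately;
except IndexError as e:        # on CUDA the same lookup triggers a device-side
    print("out-of-range id:", e)  # assert, surfacing as CUDA error 710
```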
The cause is that, taking the FB15k-237 dataset as an example, relation2id.txt
already includes the reverse relations, while build_vocabs()
in main.py
adds them again redundantly, producing an id mapping that does not match the size of the embedding layer's look-up table.
See also https://github.com/eXascaleInfolab/ActiveLink/issues/5#issue-683591605.
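If that is indeed the cause, a guard along the following lines would keep the relation vocabulary consistent with the embedding size. This is a purely hypothetical sketch (function name, file handling, and the `_reverse` suffix are assumptions, not the repo's actual `build_vocabs()` code):

```python
def build_relation_vocab(relation_names, add_reverse=True):
    """Map relation names to ids, adding reverse relations only when missing."""
    vocab = {}
    for name in relation_names:
        if name not in vocab:
            vocab[name] = len(vocab)
        if add_reverse:
            rev = name + "_reverse"
            # Guard against files that already list the reverse relations;
            # without this check the vocab grows past the embedding size.
            if not name.endswith("_reverse") and rev not in vocab:
                vocab[rev] = len(vocab)
    return vocab

# FB15k-237-style input that already contains reverse relations:
names = ["/film/actor/film", "/film/actor/film_reverse"]
print(len(build_relation_vocab(names)))   # 2; without the endswith check it would be 3
```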
Hi, I am trying to run the code on a GPU; however, a RuntimeError (CUDA error 710) occurs: