Thanks for the beautiful work!
However, I run into a 'tensor size'/'out of bounds' error on the link prediction task for all datasets, while everything works fine with node classification.
This message shows up with your example input (say, the disease_lp one):
```
Traceback (most recent call last):
  File "train.py", line 163, in <module>
    train(args)
  File "train.py", line 96, in train
    embeddings = model.encode(data['features'], data['adj_train_norm'])
  File "/home/server/code/embedding_methods/GIL/GIL/models/base_models.py", line 34, in encode
    h = self.encoder.encode(x, adj)
  File "/home/server/code/embedding_methods/GIL/GIL/models/encoders.py", line 341, in encode
    return super(GIL, self).encode((x_hyp, x), adj)
  File "/home/server/code/embedding_methods/GIL/GIL/models/encoders.py", line 22, in encode
    output, = self.layers.forward(input)
  File "/home/server/miniconda3/envs/hy-torch/lib/python3.6/site-packages/torch/nn/modules/container.py", line 139, in forward
    input = module(input)
  File "/home/server/miniconda3/envs/hy-torch/lib/python3.6/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/server/code/embedding_methods/GIL/GIL/layers/hyp_layers.py", line 217, in forward
    x = self.conv(input_h)
  File "/home/server/miniconda3/envs/hy-torch/lib/python3.6/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/server/code/embedding_methods/GIL/GIL/layers/hyp_layers.py", line 286, in forward
    out = self.propagate(edge_index, x=log_x, num_nodes=x.size(0), original_x=x)
  File "/home/server/miniconda3/envs/hy-torch/lib/python3.6/site-packages/torch_geometric/nn/conv/message_passing.py", line 262, in propagate
    kwargs)
  File "/home/server/miniconda3/envs/hy-torch/lib/python3.6/site-packages/torch_geometric/nn/conv/message_passing.py", line 170, in __collect__
    self.__set_size__(size, dim, data)
  File "/home/server/miniconda3/envs/hy-torch/lib/python3.6/site-packages/torch_geometric/nn/conv/message_passing.py", line 135, in __set_size__
    (f'Encountered tensor with size {src.size(self.node_dim)} in '
ValueError: Encountered tensor with size 8 in dimension -2, but expected size 2665.
```
When it's the airport dataset for lp, the error is:

```
RuntimeError: INDICES element is out of DATA bounds, id=1975 axis_dim=8
```
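For what it's worth, both errors look like the same underlying mismatch: `propagate` gathers per-edge features by indexing the node-feature tensor with `edge_index`, so every node id in `edge_index` must be smaller than the number of node rows. Here the tensor seems to have only 8 rows along the node dimension, while the edge indices come from the full graph (2665 nodes, or ids up to 1975 for airport). A minimal NumPy sketch of that kind of mismatch (the shapes and the names `x` / `edge_index` are just illustrative, not taken from the GIL code):

```python
import numpy as np

# Illustrative reproduction of the index/shape mismatch (not the GIL code):
# message passing gathers per-edge features via x[edge_index], so every
# node id in edge_index must be < x.shape[0].
x = np.random.rand(8, 16)                 # features for only 8 nodes
edge_index = np.array([[0, 3, 1975],      # source ids from the full graph
                       [1, 2, 4]])        # target ids

try:
    gathered = x[edge_index[0]]           # fails: id 1975 >= axis_dim 8
except IndexError as e:
    print(type(e).__name__, e)
```

So my guess is that somewhere in the lp pipeline a tensor with the wrong node dimension (8 instead of num_nodes) is handed to `propagate`, but I haven't been able to pin down where.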
Do you have any idea what the problem is?
Many thanks!