Shen-Lab / GraphCL

[NeurIPS 2020] "Graph Contrastive Learning with Augmentations" by Yuning You, Tianlong Chen, Yongduo Sui, Ting Chen, Zhangyang Wang, Yang Shen
MIT License

A question about Semisupervised_TU in pre_training #61

Open ytpjh opened 2 years ago

ytpjh commented 2 years ago

```
  File "main.py", line 320, in <module>
    run_exp_benchmark()
  File "main.py", line 278, in run_exp_benchmark
    run_exp_lib(create_n_filter_triples(datasets, feat_strs, nets,
  File "main.py", line 183, in run_exp_lib
    cross_validation_with_val_set(
  File "train_eval.py", line 115, in cross_validation_with_val_set
    train_loss, _ = train(
  File "train_eval.py", line 234, in train
    out1 = model.forward_cl(data1)
  File "res_gcn.py", line 173, in forward_cl
    return self.forward_BNConvReLU_cl(x, edge_index, batch, xg)
  File "res_gcn.py", line 180, in forward_BNConvReLU_cl
    x_ = F.relu(conv(x_, edge_index))
  File "module.py", line 1186, in _call_impl
    return forward_call(*input, **kwargs)
  File "gcn_conv.py", line 103, in forward
    edge_index, norm = GCNConv.norm(
  File "gcn_conv.py", line 90, in norm
    deg = scatter_add(edge_weight, row, dim=0, dim_size=num_nodes)
  File "scatter.py", line 27, in scatter_add
    return scatter_sum(src, index, dim, out, dim_size)
  File "scatter.py", line 9, in scatter_sum
    index = broadcast(index, src, dim)
  File "utils.py", line 12, in broadcast
    src = src.expand(other.size())
RuntimeError: expand(torch.cuda.LongTensor{[2, 12108]}, size=[12108]): the number of sizes provided (1) must be greater or equal to the number of dimensions in the tensor (2)
```

There is an error in the scatter_add() function: the traceback indicates that the size of the src tensor does not match the size of the index tensor. I have tried many times but it still does not work. Could you suggest a solution for this problem? Thanks
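For context, scatter_add requires the index tensor to be broadcastable to src, which is exactly what fails in the broadcast() frame above. A standalone sketch of that shape contract (node count made up, edge count taken from the traceback, not code from this repo):

```python
import torch
from torch_scatter import scatter_add

num_nodes, num_edges = 500, 12108
edge_weight = torch.ones(num_edges)                       # src: one weight per edge, shape [12108]
edge_index = torch.randint(0, num_nodes, (2, num_edges))  # shape [2, 12108]

# Passing the full 2 x E edge_index as the index triggers the same failure:
# broadcast() tries to expand the [2, 12108] index to src's size [12108].
# deg = scatter_add(edge_weight, edge_index, dim=0, dim_size=num_nodes)  # RuntimeError: expand(...)

# With a 1-D index (one row of edge_index), shapes line up element-wise:
row, col = edge_index
deg = scatter_add(edge_weight, row, dim=0, dim_size=num_nodes)           # degree vector, shape [500]
```

In the traceback above, the index that reaches broadcast() has shape [2, 12108] while src has shape [12108], which is exactly this kind of mismatch.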

yyou1996 commented 2 years ago

Hi @ytpjh,

The most likely reason for this issue is the package versions. Please follow https://github.com/Shen-Lab/GraphCL/tree/master/semisupervised_TU#dependencies to set up the environment.
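For reference, one quick way to see which versions are actually installed and compare them against the pinned dependencies (just a sketch):

```python
import torch
import torch_geometric
import torch_scatter

# Compare these against the versions listed under semisupervised_TU#dependencies.
print("torch:", torch.__version__)
print("torch_geometric:", torch_geometric.__version__)
print("torch_scatter:", torch_scatter.__version__)
```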

If you want to deal with this error directly, I would suggest starting from res_gcn.py, line 180, in forward_BNConvReLU_cl: x_ = F.relu(conv(x_, edge_index)).

Specifically, print out x_.shape and edge_index.max(). It seems that edge_index.max() > x_.shape[0], which means you are trying to do message passing on non-existent nodes.
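A minimal sketch of that check (the helper name check_graph is made up for illustration; the variable names follow the snippet above):

```python
import torch

def check_graph(x_: torch.Tensor, edge_index: torch.Tensor) -> None:
    """Print the shapes suggested above and flag edges that point at missing nodes."""
    print("x_:", tuple(x_.shape))                       # [num_nodes, feat_dim]
    print("edge_index max id:", edge_index.max().item())
    if edge_index.max().item() >= x_.size(0):
        raise ValueError("edge_index references nodes that are not present in x_")

# Made-up example: 10 nodes, but one edge points at node id 12, so the check fires.
x = torch.randn(10, 16)
bad_edges = torch.tensor([[0, 3], [12, 1]])
try:
    check_graph(x, bad_edges)
except ValueError as err:
    print("caught:", err)
```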