snap-stanford/ogb

Benchmark datasets, data loaders, and evaluators for graph machine learning
https://ogb.stanford.edu
MIT License

Add neighbor loader for the papers100M dataset #427

Closed. LukeLIN-web closed this issue 1 year ago.

LukeLIN-web commented 1 year ago

Use the PyG NeighborLoader to train on the papers100M dataset. I am not sure whether this should be placed in the PyG repo or this repo.
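[Editor's note] A minimal sketch of such a setup, assuming ogbn-papers100M is loaded through OGB's PygNodePropPredDataset wrapper; the fan-out [15, 10], batch size 1024, and root path are illustrative assumptions, not values taken from this issue:

from ogb.nodeproppred import PygNodePropPredDataset
from torch_geometric.loader import NeighborLoader

# Load ogbn-papers100M through OGB's PyG wrapper (the raw download is very large).
dataset = PygNodePropPredDataset(name='ogbn-papers100M', root='dataset/')
data = dataset[0]
split_idx = dataset.get_idx_split()

# Sample a fixed number of neighbors per layer, seeded on training nodes only.
# The fan-out [15, 10] and batch_size=1024 are illustrative choices.
train_loader = NeighborLoader(
    data,
    input_nodes=split_idx['train'],
    num_neighbors=[15, 10],
    batch_size=1024,
    shuffle=True,
)

for batch in train_loader:
    # The first batch.batch_size nodes of the sampled subgraph are the seed
    # (training) nodes; model output would be sliced to those rows.
    print(batch.x.shape, batch.y.shape)
    break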

LukeLIN-web commented 1 year ago

But I ran into trouble. The current problem is https://github.com/snap-stanford/ogb/issues/425#issuecomment-1500817508.

out torch.Size([1024, 172])
target torch.Size([1024, 1])
Traceback (most recent call last):
  File "neighborloader.py", line 98, in <module>
    main()
  File "neighborloader.py", line 91, in main
    loss = F.cross_entropy(out, target.squeeze(1))
  File "/opt/conda/lib/python3.8/site-packages/torch/nn/functional.py", line 2996, in cross_entropy
    return torch._C._nn.cross_entropy_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index, label_smoothing)
RuntimeError: "nll_loss_forward_reduce_cuda_kernel_2d_index" not implemented for 'Float'
LukeLIN-web commented 1 year ago

Fixed it using target.long(). F.cross_entropy expects integer (int64) class indices, but the papers100M labels are loaded as floats.
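
[Editor's note] For reference, a minimal sketch of the cast that resolves the error above; the tensor shapes mirror the logged sizes (1024 seed nodes, 172 classes) and the dummy tensors are otherwise illustrative:

import torch
import torch.nn.functional as F

out = torch.randn(1024, 172)    # model logits: [batch_size, num_classes]
target = torch.zeros(1024, 1)   # papers100M labels come back as float
# cross_entropy requires int64 class indices, hence the .long() cast
loss = F.cross_entropy(out, target.squeeze(1).long())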