snap-stanford / ogb

Benchmark datasets, data loaders, and evaluators for graph machine learning
https://ogb.stanford.edu
MIT License

"embeddings.pt" not created/saved after running "node2vec.py" #110

Closed: mechantrix closed this issue 3 years ago

mechantrix commented 3 years ago

Dear Matthias,

I am having a problem when running your program. Although the ogbn-proteins dataset is downloaded, I receive an error (below) right at the end, and the embeddings.pt file doesn't seem to be created. I am using Ubuntu, PyTorch 1.7.1, and the corresponding PyG version. Do you have any thoughts? Thank you very much for your time!

~$ python3 node2vec.py
/home/asouza/anaconda3/lib/python3.6/site-packages/numba/decorators.py:146: RuntimeWarning: Caching is not available when the 'parallel' target is in use. Caching is now being disabled to allow execution to continue.
  warnings.warn(msg, RuntimeWarning)
Downloading http://snap.stanford.edu/ogb/data/nodeproppred/proteins.zip
Downloaded 0.21 GB: 100%|█████████████████████████████████████████████████████████████| 216/216 [00:48<00:00, 4.45it/s]
Extracting dataset/proteins.zip
Processing...
Loading necessary files...
This might take a while.
Processing graphs...
100%|█████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:03<00:00, 3.48s/it]
Converting graphs into PyG objects...
100%|███████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 7169.75it/s]
Saving...
Done!
Traceback (most recent call last):
  File "node2vec.py", line 58, in <module>
    main()
  File "node2vec.py", line 38, in main
    optimizer = torch.optim.SparseAdam(model.parameters(), lr=args.lr)
  File "/home/asouza/anaconda3/lib/python3.6/site-packages/torch/optim/sparse_adam.py", line 49, in __init__
    super(SparseAdam, self).__init__(params, defaults)
  File "/home/asouza/anaconda3/lib/python3.6/site-packages/torch/optim/optimizer.py", line 47, in __init__
    raise ValueError("optimizer got an empty parameter list")
ValueError: optimizer got an empty parameter list

rusty1s commented 3 years ago

Should be fixed in master. This is caused by a bug introduced in PyTorch 1.7.0: SparseAdam now iterates over its params argument for validation, which exhausts the generator returned by model.parameters() and leaves the optimizer with an empty parameter list. As a result,

optimizer = torch.optim.SparseAdam(model.parameters(), lr=args.lr)

needs to be replaced with

optimizer = torch.optim.SparseAdam(list(model.parameters()), lr=args.lr)
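For reference, a minimal sketch of how the workaround fits into a proteins node2vec setup. The hyperparameter values (embedding_dim, walk_length, context_size, walks_per_node, lr) are placeholders rather than the script's actual defaults, and the final save line is an assumption about how embeddings.pt gets written (saving the embedding weight matrix), not necessarily identical to the example script.

import torch
from torch_geometric.nn import Node2Vec
from ogb.nodeproppred import PygNodePropPredDataset

# Load ogbn-proteins as a PyG graph object.
dataset = PygNodePropPredDataset(name='ogbn-proteins')
data = dataset[0]

# sparse=True gives the model a sparse embedding table, which requires SparseAdam.
# NOTE: the hyperparameters below are placeholders, not the script's defaults.
model = Node2Vec(data.edge_index, embedding_dim=128, walk_length=40,
                 context_size=20, walks_per_node=10, sparse=True)

# On PyTorch 1.7.0, SparseAdam iterates over `params` to validate them, which
# would exhaust the generator returned by model.parameters(). Materializing the
# parameters as a list avoids the "optimizer got an empty parameter list" error.
optimizer = torch.optim.SparseAdam(list(model.parameters()), lr=0.01)

# ... training loop over model.loader(...) / model.loss(...) omitted ...

# Assumption: embeddings.pt holds the learned embedding weight matrix.
torch.save(model.embedding.weight.data.cpu(), 'embeddings.pt')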
mechantrix commented 3 years ago

Thank you so much!