snap-stanford / GraphGym

Platform for designing and evaluating Graph Neural Networks (GNN)

Error using TU_IMDB dataset #36

Open psanch21 opened 2 years ago


Hi,

I'm getting the following error when I try to use the TU_IMDB dataset:

Traceback (most recent call last):
  File "/Users/psanchez/Documents/GitHub/transformer_message_passing/run/main.py", line 42, in <module>
    datasets = create_dataset()
  File "/Users/psanchez/miniconda3/envs/transformer_mp/lib/python3.9/site-packages/graphgym-0.3.1-py3.9.egg/graphgym/loader.py", line 197, in create_dataset
    graphs = load_dataset()
  File "/Users/psanchez/miniconda3/envs/transformer_mp/lib/python3.9/site-packages/graphgym-0.3.1-py3.9.egg/graphgym/loader.py", line 111, in load_dataset
    graphs = load_pyg(name, dataset_dir)
  File "/Users/psanchez/miniconda3/envs/transformer_mp/lib/python3.9/site-packages/graphgym-0.3.1-py3.9.egg/graphgym/loader.py", line 74, in load_pyg
    graphs = GraphDataset.pyg_to_graphs(dataset_raw)
  File "/Users/psanchez/miniconda3/envs/transformer_mp/lib/python3.9/site-packages/deepsnap/dataset.py", line 1276, in pyg_to_graphs
    return [
  File "/Users/psanchez/miniconda3/envs/transformer_mp/lib/python3.9/site-packages/deepsnap/dataset.py", line 1277, in <listcomp>
    Graph.pyg_to_graph(
  File "/Users/psanchez/miniconda3/envs/transformer_mp/lib/python3.9/site-packages/deepsnap/graph.py", line 2027, in pyg_to_graph
    Graph.add_node_attr(G, key, value)
  File "/Users/psanchez/miniconda3/envs/transformer_mp/lib/python3.9/site-packages/deepsnap/graph.py", line 1911, in add_node_attr
    attr_dict = dict(zip(node_list, node_attr))
TypeError: 'int' object is not iterable
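For what it's worth, the `zip` call in deepsnap's `add_node_attr` (graph.py:1911) fails like this whenever the "per-node attribute" it receives is actually a graph-level scalar such as a plain int. A minimal pure-Python sketch of that failure mode (`add_node_attr` here is a simplified stand-in for deepsnap's method, and `add_node_attr_broadcast` is a purely hypothetical guard, not real deepsnap code):

```python
def add_node_attr(node_list, node_attr):
    # simplified stand-in for deepsnap.graph.Graph.add_node_attr:
    # pair each node id with its attribute value
    return dict(zip(node_list, node_attr))

nodes = [0, 1, 2]

# a per-node sequence works as intended
print(add_node_attr(nodes, [0.1, 0.2, 0.3]))  # {0: 0.1, 1: 0.2, 2: 0.3}

# a graph-level scalar attribute (a bare int) does not
try:
    add_node_attr(nodes, 20)
except TypeError as exc:
    print(exc)  # TypeError, as in the traceback above

def add_node_attr_broadcast(node_list, node_attr):
    # hypothetical guard: broadcast a scalar attribute to every node
    if not hasattr(node_attr, "__iter__"):
        node_attr = [node_attr] * len(node_list)
    return dict(zip(node_list, node_attr))

print(add_node_attr_broadcast(nodes, 20))  # {0: 20, 1: 20, 2: 20}
```

If that is what's happening here, it would suggest the TU_IMDB graphs carry no node features, so one of their attributes reaches `add_node_attr` as a plain int rather than a per-node tensor — just a guess from the traceback, though.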

I'm running main.py with the following dataset configuration file:

out_dir: results
dataset:
  format: PyG
  name: TU_IMDB
  task: graph
  task_type: classification
  transductive: False
  split: [0.8, 0.2]
  augment_feature: []
  augment_feature_dims: [10]
  augment_feature_repr: position
  augment_label: ''
  augment_label_dims: 5
  transform: none
train:
  batch_size: 32
  eval_period: 20
  ckpt_period: 100
model:
  type: gnn
  loss_fun: cross_entropy
  edge_decoding: dot
  graph_pooling: add
gnn:
  layers_pre_mp: 1
  layers_mp: 2
  layers_post_mp: 1
...

Any idea why this might be happening?

Thanks a lot in advance.