mufeili opened this issue 2 years ago
- `batch` is None in the evaluation procedure, because I don't apply `GraphDataLoader` to evaluation. `batch` is only related to `LayerNorm` at L29.
- The `s` after `train_mask` and `val_mask` is because there are 20 of them in the WikiCS dataset, while `test_mask` is 1. But other datasets are not like this, so I decided to change the mask to `data`.
- `dataset_dir`: do you mean download the dataset to the default directory?
- The re-created `PPIDataset` at L45 is used for the evaluation procedure. In the training procedure, the train set and validation set are concatenated. In order to evaluate the three sets separately, I have to re-create them.
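The mask shapes behind the `train_mask`/`val_mask` plural can be sketched as follows (the shapes and variable names are assumptions for illustration; WikiCS provides 20 train/val splits per node but only a single test split):

```python
import numpy as np

# Hypothetical WikiCS-style masks: one column per train/val split,
# but a single shared test mask.
num_nodes, num_splits = 100, 20
train_masks = np.zeros((num_nodes, num_splits), dtype=bool)
val_masks = np.zeros((num_nodes, num_splits), dtype=bool)
test_mask = np.zeros(num_nodes, dtype=bool)

# Selecting one of the 20 splits reduces to the usual 1-D mask,
# which is the shape other datasets expose directly.
split_id = 0
train_mask = train_masks[:, split_id]
```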
Got it. Thanks.
I have also started working on a PR for WikiCS and the transforms.
Sounds great.
> Remove the `dataset_dir`

Do you mean download the dataset to the default directory?
Yes, particularly if you add WikiCS as a built-in DGL dataset.
For `NormalizeFeatures`, should I add a parameter `node_feat_names` to indicate which features the transform is applied to? In the PyG implementation, they apply the transform to all features.
You can have two arguments to separately specify the `ndata` and `edata` feature names to normalize. If None, then all features will be normalized if applicable.
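A minimal sketch of that interface, assuming row-sum normalization and a dict-like stand-in for the graph (the argument names `node_feat_names`/`edge_feat_names` and the `{"ndata": ..., "edata": ...}` layout are assumptions, not the actual DGL API):

```python
import numpy as np

class NormalizeFeatures:
    """Row-normalize selected node/edge features.

    If node_feat_names / edge_feat_names is None, all features
    in the corresponding frame are normalized.
    """

    def __init__(self, node_feat_names=None, edge_feat_names=None):
        self.node_feat_names = node_feat_names
        self.edge_feat_names = edge_feat_names

    @staticmethod
    def _row_normalize(feat):
        # Divide each row by its sum; leave all-zero rows untouched.
        row_sum = feat.sum(axis=1, keepdims=True)
        row_sum[row_sum == 0.0] = 1.0
        return feat / row_sum

    def __call__(self, g):
        # g is modeled as {"ndata": {...}, "edata": {...}} for illustration.
        for name in (self.node_feat_names or list(g["ndata"])):
            g["ndata"][name] = self._row_normalize(g["ndata"][name])
        for name in (self.edge_feat_names or list(g["edata"])):
            g["edata"][name] = self._row_normalize(g["edata"][name])
        return g
```

With `node_feat_names=["a"]`, only `ndata["a"]` is touched; with the default `None`, every feature in the frame is normalized.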
For `NodeFeaturesMasking`, should I handle the case where `ndata['feat'].shape` is `(num_nodes,)`?
I think you can assume the node features to be 2-dimensional for now.
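For reference, a sketch of masking under that 2-D assumption (the function name, the column-wise masking choice, and the drop probability are assumptions, not the actual implementation):

```python
import numpy as np

def mask_node_features(feat, p, rng):
    """Zero out each feature column independently with probability p.

    Assumes 2-D features of shape (num_nodes, feat_dim); a 1-D
    (num_nodes,) array is rejected rather than special-cased.
    """
    if feat.ndim != 2:
        raise ValueError("expected node features of shape (num_nodes, feat_dim)")
    drop = rng.random(feat.shape[1]) < p  # True marks a masked column
    out = feat.copy()
    out[:, drop] = 0.0
    return out
```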
README file
- Since WikiCS is not available in DGL, it will be great to open a PR for contributing a built-in dataset.
- If 4 is achieved, then likely we will no longer need the `dataset_dir` argument.
- `dataset` should be `amazon_photos` rather than `Amazon Photos`.

main.py
- The code block was commented out.

transforms.py

data.py
- Why is there an `s` after `train_mask` and `val_mask`, but not `test_mask`?
- Why re-create `PPIDataset` at L45?

main.py
- Rename `data` to `g` for clarity at L86.

model.py
- `batch` is always not None for PPI + GraphSAGE_GCN, right? If so, perhaps there's no need to handle the case where `batch` is None.