You can find a sampling example for GCN at https://github.com/dmlc/dgl/tree/master/examples/pytorch/sampling
We are also working on mini-batch training for GCMC right now.
The sampling-based GraphSAGE example is now live: https://github.com/dmlc/dgl/tree/master/examples/pytorch/graphsage
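In case it helps later readers, here is a minimal sketch of the mini-batch neighbor-sampling loop that example implements, assuming a DGL release that ships the dgl.dataloading module with NodeDataLoader (roughly 0.5–0.7; later releases fold it into dgl.dataloading.DataLoader). The dataset, hidden size, fanouts, and epoch count below are illustrative choices, not values from the example script:

```python
import dgl
import dgl.nn as dglnn
import torch
import torch.nn as nn
import torch.nn.functional as F

# Load a small benchmark graph; its ndata already holds 'feat', 'label', 'train_mask'.
g = dgl.data.CoraGraphDataset()[0]
train_nids = g.ndata['train_mask'].nonzero(as_tuple=True)[0]

class SAGE(nn.Module):
    def __init__(self, in_feats, n_hidden, n_classes):
        super().__init__()
        self.layers = nn.ModuleList([
            dglnn.SAGEConv(in_feats, n_hidden, 'mean'),
            dglnn.SAGEConv(n_hidden, n_classes, 'mean'),
        ])

    def forward(self, blocks, x):
        # One sampled bipartite block per layer.
        for i, (layer, block) in enumerate(zip(self.layers, blocks)):
            x = layer(block, x)
            if i != len(self.layers) - 1:
                x = F.relu(x)
        return x

# Sample 10 neighbors per node for the first layer, 25 for the second.
sampler = dgl.dataloading.MultiLayerNeighborSampler([10, 25])
dataloader = dgl.dataloading.NodeDataLoader(
    g, train_nids, sampler, batch_size=1024, shuffle=True, drop_last=False)

model = SAGE(g.ndata['feat'].shape[1], 16, int(g.ndata['label'].max()) + 1)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for epoch in range(10):
    for input_nodes, output_nodes, blocks in dataloader:
        x = blocks[0].srcdata['feat']    # features of all sampled input nodes
        y = blocks[-1].dstdata['label']  # labels of the seed (output) nodes
        loss = F.cross_entropy(model(blocks, x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Each blocks list holds one sampled bipartite graph per layer, so memory cost scales with the fanouts and batch size rather than with the full graph.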
Hi all, I would like to ask: is the implementation of sampling-based GraphSAGE (train_sampling.py) transductive? I found that when it aggregates neighbors, it also counts nodes that belong to the test set. Thanks.
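On the transductive point above: one common way to make training strictly inductive is to sample from the subgraph induced by the training nodes, so validation/test nodes never enter aggregation. A minimal sketch, reusing g from the snippet above and assuming a DGL version that provides dgl.node_subgraph (0.5+); the 'train_mask' name is an assumption, not taken from train_sampling.py:

```python
import dgl
import torch

# Keep only training nodes; edges touching val/test nodes are dropped with them.
train_nids = g.ndata['train_mask'].nonzero(as_tuple=True)[0]
train_g = dgl.node_subgraph(g, train_nids)

# Node IDs are relabeled inside the subgraph; the original IDs are kept
# in train_g.ndata[dgl.NID]. Use every node of train_g as a seed node.
seeds = torch.arange(train_g.num_nodes())
```

Training on train_g and then running inference on the full g is the usual inductive protocol; a script that samples from the whole graph during training would be transductive in the sense you describe.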
Hi,
Can GraphSAGE/GCMC support mini-batch training / distributed training? Is there any documentation?
graphSAGE: https://github.com/dmlc/dgl/blob/master/examples/pytorch/graphsage/graphsage.py
GCMC: https://github.com/dmlc/dgl/blob/master/examples/mxnet/gcmc/train.py
Mini-batch training appears in the code released by the papers' authors, but the example code in dgl/examples does not show an explicit mini-batch training style. How do I use mini-batch training with DGL? I would also appreciate best practices and performance evaluations for large-scale graph training beyond https://docs.dgl.ai/en/latest/tutorials/models/5_giant_graph/2_giant.html
Best Wishes