-
Hi
Thanks for providing this clean example of a GCN in PyTorch. I am new to graph machine learning, so I would like to ask: does this work fit the following scenario (see the figure and text below)?
Gi…
-
[This paper](http://www.machinelearning.org/archive/icml2009/papers/281.pdf) makes it easier to train previous algorithms. The paper doesn't discuss directed graphs explicitly but says it can be mappe…
-
Given the limited memory and bandwidth in a browser, sparse matrices could speed up transmission times and lower the memory usage.
Some example use cases: (1) If a model inefficiently creates many …
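As a rough illustration of the memory side (Python/SciPy here purely to show the scale of the savings; the 1000x1000 size and ~99% sparsity are assumptions, not measurements from a real model):

```python
import numpy as np
from scipy import sparse

# Purely illustrative: a 1000 x 1000 matrix where roughly 99% of entries are zero.
rng = np.random.default_rng(0)
dense = rng.random((1000, 1000)) * (rng.random((1000, 1000)) < 0.01)

csr = sparse.csr_matrix(dense)

dense_bytes = dense.nbytes
csr_bytes = csr.data.nbytes + csr.indices.nbytes + csr.indptr.nbytes
print(f"dense: {dense_bytes / 1e6:.1f} MB, CSR: {csr_bytes / 1e6:.2f} MB")
```

For a matrix this sparse, the CSR representation is roughly two orders of magnitude smaller than the dense array, which is the kind of gap that would matter for transfer to and storage in a browser.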
-
Hey there,
So I'm currently trying to use the Equiformer for a protein/ligand prediction task. I've inherited the dataset from an earlier model I made, and it is in the PyG batching format of one l…
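For reference, here is a minimal sketch of the PyG batching format I mean, with toy tensors standing in for the real protein/ligand graphs:

```python
import torch
from torch_geometric.data import Data, Batch

# Two toy graphs standing in for a protein and a ligand (random features, made-up edges).
g1 = Data(x=torch.randn(4, 16), edge_index=torch.tensor([[0, 1, 2], [1, 2, 3]]))
g2 = Data(x=torch.randn(3, 16), edge_index=torch.tensor([[0, 1], [1, 2]]))

# PyG concatenates the graphs into one disconnected graph and records which node
# came from which original graph in `batch.batch`.
batch = Batch.from_data_list([g1, g2])
print(batch.x.shape)  # torch.Size([7, 16])
print(batch.batch)    # tensor([0, 0, 0, 0, 1, 1, 1])
```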
-
## 🚀 Feature
Support a graph pooling operation on cliques.
## Motivation
Graph coarsening is common in tasks where we try to extract graph features at different scales. Usually, the input graph…
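A rough sketch of the idea, using NetworkX maximal cliques as the pooling clusters and mean-pooling node features within each clique (the graph, feature sizes, and the "assign each node to its first clique" rule are just placeholders, not a proposed API):

```python
import networkx as nx
import numpy as np

# Toy graph and node features; in practice these would come from the task at hand.
g = nx.karate_club_graph()
x = np.random.rand(g.number_of_nodes(), 8)

# Use maximal cliques as clusters; a node in several cliques goes to the first one found.
cliques = list(nx.find_cliques(g))
assignment = {}
for cid, clique in enumerate(cliques):
    for node in clique:
        assignment.setdefault(node, cid)

# Mean-pool features within each clique to get coarsened node features.
pooled = np.zeros((len(cliques), x.shape[1]))
counts = np.zeros(len(cliques))
for node, cid in assignment.items():
    pooled[cid] += x[node]
    counts[cid] += 1
pooled = pooled[counts > 0] / counts[counts > 0][:, None]
print(pooled.shape)  # (number of non-empty cliques, feature dim)
```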
-
## 🐛 Bug
Sparse tensors can't be used in a DataLoader with multiple workers
## To Reproduce
Steps to reproduce the behavior:
```python
import torch.utils.data as D
from scipy.sparse…
```
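Since the snippet above is cut off, here is a minimal self-contained sketch of the kind of setup that hits this, with toy sparse COO tensors standing in for the real data (dataset size and shapes are arbitrary):

```python
import torch
import torch.utils.data as D

class SparseDataset(D.Dataset):
    def __len__(self):
        return 8

    def __getitem__(self, idx):
        # A tiny sparse COO tensor as a stand-in for the real samples.
        indices = torch.tensor([[0, 1], [1, 0]])
        values = torch.tensor([1.0, 2.0])
        return torch.sparse_coo_tensor(indices, values, (2, 2))

def collate(samples):
    return samples  # keep the sparse tensors as a plain list

if __name__ == "__main__":
    loader = D.DataLoader(SparseDataset(), batch_size=2, num_workers=2, collate_fn=collate)
    for batch in loader:  # the error shows up here once num_workers > 0
        print(batch)
```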
-
Hi there,
Would you consider releasing the code for obtaining the NELL23K and WD-singer datasets?
And did you download your Wikipedia data directly from the official website “https://www.wikidata.…
-
### Description
Our current implementation of GAT does not allow the training and test graphs to be of different sizes, since the tensor dimensions are fixed at build time. This is due to our sparse m…
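Not this library's code, but a minimal, hypothetical Keras-only sketch of the underlying limitation: once the model is built against a fixed-size adjacency, a graph with a different number of nodes no longer matches the input dimensions.

```python
import numpy as np
import tensorflow as tf

N_TRAIN = 5  # number of nodes the model is built for (placeholder value)

features = tf.keras.Input(shape=(N_TRAIN, 8))
adjacency = tf.keras.Input(shape=(N_TRAIN, N_TRAIN))

# Stand-in for an attention/aggregation step over a fixed-size adjacency.
hidden = tf.keras.layers.Dense(4)(tf.matmul(adjacency, features))
model = tf.keras.Model([features, adjacency], hidden)

# Works for graphs with the build-time number of nodes...
model.predict([np.random.rand(1, N_TRAIN, 8), np.eye(N_TRAIN)[None]])

# ...but a test graph with, say, 7 nodes no longer matches the fixed input dimensions:
# model.predict([np.random.rand(1, 7, 8), np.eye(7)[None]])  # shape mismatch
```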
-
I have tried running the evaluation script with the default number of samples (i.e., 10000, 5000, 2500 at the three scales) and have also tried lower numbers (e.g., 1000, 500, 250, and even 100, 50, 25). I do n…
-
### Feature scope
core
### Describe your suggested feature
Currently there is sparse use of ["Dynamic Labels"](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/graph-dynamic-l…