Closed XVilka closed 3 years ago
AFAIK there's nothing fundamental to be done here (it all just boils down to matmul, as usual), but we can probably make some nice convenience layers for this, and perhaps even make sure that e.g. LightGraphs graphs can be fed into Flux models.
Would be great to have someone interested in this outline a game plan.
Some inspiration for this:
https://github.com/rusty1s/pytorch_geometric https://github.com/deepmind/graph_nets https://blog.acolyer.org/2018/09/19/relational-inductive-biases-deep-learning-and-graph-networks/
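To make the "it all boils down to matmul" point concrete, here is a minimal NumPy sketch of one graph-convolution layer in the style of Kipf & Welling. The toy graph, feature sizes, and weights are made up for illustration; this is a sketch, not any library's implementation:

```python
import numpy as np

# Toy undirected graph on 4 nodes: edges (0,1), (1,2), (2,3)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Add self-loops and symmetrically normalize: A_hat = D^{-1/2} (A + I) D^{-1/2}
A_tilde = A + np.eye(4)
d = A_tilde.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt

# Node features (4 nodes, 3 features) and a weight matrix (3 -> 2)
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 2))

# One graph-convolution layer: H' = ReLU(A_hat @ H @ W)
H_next = np.maximum(A_hat @ H @ W, 0.0)
print(H_next.shape)  # (4, 2)
```

The forward pass really is two matrix products plus a pointwise nonlinearity; the interesting engineering is in sparse adjacency storage and batching, which is where dedicated layers earn their keep.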
I think some of the more advanced stuff is more than just matmul.
I was just contemplating something along these lines as a potential Google Summer of Code project that I might be interested in doing.
Well, we are working on this at the moment. You can hack something together pretty quickly with the Mill.jl project. I tried just yesterday and had a working example in about an afternoon. Of course, knowing Mill.jl well was a big help.
We plan to eventually release a library.
So, I've been delving into the literature a bit and it seems like this could be a reasonable project for the summer. As far as deliverables for GSoC, I'm thinking of something along the lines of the following:
- Minimum viable product: a prototype GNN that could live in the model zoo.
- Expected: implementation of additional layers/functions necessary to support GNNs within the Flux framework along with the capability to read from and write to graphs with Julia's LightGraphs/MetaGraphs packages.
- Reach: Expected + implementations of notable GNNs for the model zoo.
Feel free to share thoughts/feedback/advice. Also, if anyone who is qualified to mentor for GSoC is interested in a project like this, it would be great to connect.
I'm thinking about this too. But it may be too straightforward if we just add a simple GNN layer. I think we need something more. What do you think about it?
If you look at our project Mill.jl on GitHub, it implements all the layers you need for GNNs. I implemented a GNN with Mill.jl and LightGraphs.jl in four hours while I had the flu.
@pevnak how does Mill compare to libraries like https://github.com/rusty1s/pytorch_geometric https://github.com/deepmind/graph_nets ?
It seems more general? In that graphs can be modeled as nested bags?
> If you look at our project Mill.jl on GitHub, it implements all the layers you need for GNNs. I implemented a GNN with Mill.jl and LightGraphs.jl in four hours while I had the flu.
That's cool, but I'm trying to figure out a reasonable contribution (for a cs undergrad to accomplish over the summer) that would be to Flux itself (as they are participating in Google Summer of Code). I'm not sure how the Flux folks feel about introducing external dependencies into their codebase, but, if they are open to it, Mill.jl could be useful.
> I'm thinking about this too. But it may be too straightforward if we just add a simple GNN layer. I think we need something more. What do you think about it?
From what I've gathered, it seems like it is a bit more involved than just adding a layer. I think there needs to be a good amount of additional code to accomplish what has to happen under the hood. Take a look at how the libraries that @datnamer mentioned work.
I am on holidays, so I apologize for not being responsive. Our Mill library was originally designed for multiple-instance learning, nesting of bags, and Cartesian products, so that we can model data stored in JSON and similar tree-like formats.
As such, the critical part of the library is its aggregation operations, like mean and max, which can be performed over only a subset of columns (data). This type of operation is used in graph NNs, which lets us use Mill to handle graph-like data efficiently.
If you take these operations and port them to Flux, you will have a GNN layer, but the trick is really to make them efficient, which the Mill library does. I am not sure whether support for graph NNs should be part of Flux itself, as I thought Flux should mainly be an AD engine.
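As a rough illustration of the segmented aggregation described above (this is not Mill.jl's actual API; the function names and the dense loop are illustrative stand-ins), mean/max pooling over bags of instances might look like:

```python
import numpy as np

def segment_mean(X, segments, num_segments):
    """Mean-pool rows of X grouped by segment id (a toy stand-in for
    bag aggregation; a real implementation would be vectorized)."""
    out = np.zeros((num_segments, X.shape[1]))
    counts = np.zeros(num_segments)
    for row, s in zip(X, segments):
        out[s] += row
        counts[s] += 1
    return out / counts[:, None]

def segment_max(X, segments, num_segments):
    """Max-pool rows of X grouped by segment id."""
    out = np.full((num_segments, X.shape[1]), -np.inf)
    for row, s in zip(X, segments):
        out[s] = np.maximum(out[s], row)
    return out

# 5 instances with 2 features each, grouped into 2 bags
X = np.array([[1., 2.], [3., 4.], [5., 6.], [7., 8.], [9., 10.]])
segments = np.array([0, 0, 1, 1, 1])
print(segment_mean(X, segments, 2))  # [[2. 3.] [7. 8.]]
print(segment_max(X, segments, 2))   # [[3. 4.] [9. 10.]]
```

In the graph setting the "bags" are each node's neighbourhood, so the same primitive serves both multiple-instance learning and GNN message aggregation.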
I'm interested in GNNs on Flux.jl, too. I agree with bridging LightGraphs.jl and Flux.jl. Where can I contribute?
This is also interesting and quite related: https://github.com/Accenture/AmpliGraph
Also this paper is a good show case on using GNNs https://arxiv.org/abs/1904.12787
You may want to have a look at Deep Graph Library. This is a framework running on top of PyTorch or MXNet that allows implementing Graph Neural Networks using a message passing approach.
For instance, a single layer of graph convolution is implemented as follows (from https://github.com/dmlc/dgl):
```python
import dgl.function as fn
import torch.nn as nn
import torch.nn.functional as F
from dgl import DGLGraph

msg_func = fn.copy_src(src='h', out='m')
reduce_func = fn.sum(msg='m', out='h')

class GCNLayer(nn.Module):
    def __init__(self, in_feats, out_feats):
        super(GCNLayer, self).__init__()
        self.linear = nn.Linear(in_feats, out_feats)

    def apply(self, nodes):
        return {'h': F.relu(self.linear(nodes.data['h']))}

    def forward(self, g, feature):
        g.ndata['h'] = feature
        g.update_all(msg_func, reduce_func)
        g.apply_nodes(func=self.apply)
        return g.ndata.pop('h')
```
This approach is really clean and intuitive. It would be really nice to have something similar in Julia.
Here is the paper they published with the library: https://rlgm.github.io/papers/49.pdf
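For intuition, the copy-source message plus sum reduction used in the snippet above is, on a dense adjacency matrix, equivalent to a single matrix product. A small NumPy sketch (toy graph and features, illustrative only):

```python
import numpy as np

# Toy 3-node graph: node 0 is connected to nodes 1 and 2.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
H = np.array([[1., 0.],
              [0., 1.],
              [2., 2.]])

# Explicit message passing: each node gathers its neighbours'
# 'h' vectors (copy-source) and sums them (sum reduction).
msgs = [sum(H[j] for j in range(3) if A[i, j]) for i in range(3)]
H_mp = np.stack(msgs)

# The same update as one matmul with the adjacency matrix.
H_mm = A @ H
assert np.allclose(H_mp, H_mm)
print(H_mm)  # [[2. 3.] [1. 0.] [1. 0.]]
```

The message-passing formulation wins when the graph is sparse or when messages depend on edge data, which is why DGL exposes it as the primitive rather than a dense matmul.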
Unfortunately, life circumstances were such that I wasn't able to get a proposal together in time for the GSoC deadline this year, but I may try to put together something similar for next summer if this still seems worthwhile/needs doing. In the meantime though, if anyone wants to take point on this, feel free; I probably won't be able to dedicate too much time to this between now and then.
Hey, guys! I started the package for GNN/geometric deep learning for Flux. It's still not mature, and I will keep optimizing this extension. Any suggestions or pull requests are welcome. [ANN] GeometricFlux.jl - Geometric Deep Learning for Flux
The repository itself is located at https://github.com/yuehhua/GeometricFlux.jl, so anyone can help with pull requests/suggestions/etc.
Since we have GeometricFlux, we can close this.
One interesting paper recently was published on the subject: "Geometric Deep Learning Grids, Groups, Graphs, Geodesics, and Gauges" by Michael M. Bronstein, Joan Bruna, Taco Cohen, Petar Veličković
@XVilka It is welcome to file an issue to FluxML/GeometricFlux.jl for more discussion.
It is a fairly new method, but with big application potential.
"Graph neural networks (GNNs) are connectionist models that capture the dependence of graphs via message passing between the nodes of graphs. Unlike standard neural networks, graph neural networks retain a state that can represent information from its neighborhood with arbitrary depth. Although the primitive GNNs have been found difficult to train for a fixed point, recent advances in network architectures, optimization techniques, and parallel computation have enabled successful learning with them. In recent years, systems based on graph convolutional network (GCN) and gated graph neural network (GGNN) have demonstrated ground-breaking performance on many tasks mentioned above. In this survey, we provide a detailed review over existing graph neural network models, systematically categorize the applications, and propose four open problems for future research. "
See arXiv:1812.08434 and the corresponding PDF.
UPDATE: GeometricFlux.jl is under construction already and being prepared for Flux integration.