google-deepmind / graph_nets

Build Graph Nets in Tensorflow
https://arxiv.org/abs/1806.01261
Apache License 2.0

Use a CNN instead of an MLP in EncodeProcessDecode class #21

Closed ferreirafabio closed 5 years ago

ferreirafabio commented 5 years ago

In order to process my node features of shape [batch_size, feature_length] with 1D convolutions, I want to use a function make_cnn_model() as the node_model_fn in the MLPGraphIndependent (encoder part), replacing the make_mlp_model() function from the examples. Because 1D convolutions typically operate over a channel dimension, I added an extra channel dimension to the node features, yielding the shape [?, feature_length, 1].

The sonnet library provides a ConvNet2D class, but unfortunately no ConvNet1D implementation is available. I tried to build my own ConvNet1D module with snt.Sequential and a stack of multiple snt.python.modules.conv.Conv1D modules, but failed.
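Roughly, the attempt looks like the sketch below (a hedged reconstruction assuming the Sonnet 1.x API; the layer sizes are illustrative, not the exact ones I used):

```python
import sonnet as snt
import tensorflow as tf

def make_cnn_model():
  # A stack of 1D convolutions over [num_nodes, feature_length, 1] inputs.
  # The leading lambda adds the channel axis described above; note that the
  # output stays rank 3, which is what later breaks the edge block.
  return snt.Sequential([
      lambda x: tf.expand_dims(x, axis=-1),
      snt.Conv1D(output_channels=64, kernel_shape=3),
      tf.nn.relu,
      snt.Conv1D(output_channels=128, kernel_shape=3),
      tf.nn.relu,
  ])
```

This fails with the following error: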

```
ValueError: Shape must be rank 2 but is rank 3 for 'EncodeProcessDecode_1/MLPGraphNetwork/graph_network/edge_block/concat' (op: 'ConcatV2') with input shapes: [?,64], [?,268812,128], [?,268812,128], [?,256], []
```

I assume I must reduce the initially expanded dimension again, but I can't infer the correct procedure from either the sonnet or the graph_nets documentation. I believe the question is better raised here than in the sonnet repo, since it seems to be an issue with the subsequent processing by the core or decoder in the EncodeProcessDecode implementation.

alvarosg commented 5 years ago

This does not seem to be failing in the GraphIndependent, but in the EdgeBlock used within a GraphNetwork: some of the models are outputting 2d tensors and others 3d tensors, and the edge block cannot concatenate tensors of different ranks.
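For illustration (a hedged toy example, not from the thread), concatenating a rank-2 and a rank-3 tensor fails in exactly this way:

```python
import tensorflow as tf

edges = tf.zeros([5, 64])         # rank-2 output, e.g. from an MLP edge model
nodes = tf.zeros([5, 100, 128])   # rank-3 output, e.g. from a Conv1D node model
# Raises "Shape must be rank 2 but is rank 3 ... (op: 'ConcatV2')".
tf.concat([edges, nodes], axis=-1)
```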

The easiest way to have everything be 2d tensors is to wrap your 1d convnet module with a tf.expand_dims just before it and a tf.squeeze just after it (alternatively, you could use reshapes if you want to work with more than one channel).
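A minimal sketch of that wrapping, assuming the Sonnet 1.x AbstractModule API (the name NodeCNN and the layer sizes are illustrative, not from this thread):

```python
import sonnet as snt
import tensorflow as tf

class NodeCNN(snt.AbstractModule):
  """Runs Conv1D layers between a tf.expand_dims and a tf.squeeze."""

  def __init__(self, name="node_cnn"):
    super(NodeCNN, self).__init__(name=name)

  def _build(self, inputs):
    # [num_nodes, feature_length] -> [num_nodes, feature_length, 1].
    net = tf.expand_dims(inputs, axis=-1)
    net = tf.nn.relu(snt.Conv1D(output_channels=16, kernel_shape=3)(net))
    # A single output channel lets tf.squeeze restore rank 2; with more
    # channels, flatten with snt.BatchFlatten (a reshape) instead.
    net = snt.Conv1D(output_channels=1, kernel_shape=3)(net)
    return tf.squeeze(net, axis=-1)
```

The class can then be passed as the node model constructor, e.g. modules.GraphIndependent(node_model_fn=NodeCNN), so all downstream blocks keep receiving rank-2 node features.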

ferreirafabio commented 5 years ago

Appreciate it!