hkim716 opened 3 years ago
Yes, indeed. This is exactly what is happening. One typically also applies a final linear or non-linear transformation on top of the pooled features, e.g., via an MLP. In order to make use of mini-batching capabilities, you need to swap out the `torch.mean` call with our `torch_geometric.global_mean_pool` functionality.
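For reference, a minimal sketch of that setup (the model name `GCNRegressor`, the hidden size, and the two-layer architecture are illustrative assumptions, not something fixed by this thread): two `GCNConv` layers produce node embeddings, `global_mean_pool` averages them per graph in the mini-batch, and a small MLP maps the pooled vector to the two regression targets.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool


class GCNRegressor(torch.nn.Module):
    # Illustrative sketch: each node carries 1 input feature, each graph has
    # a 2-dimensional regression target; hidden size is an arbitrary choice.
    def __init__(self, in_channels=1, hidden_channels=64, out_channels=2):
        super().__init__()
        self.conv1 = GCNConv(in_channels, hidden_channels)
        self.conv2 = GCNConv(hidden_channels, hidden_channels)
        self.mlp = torch.nn.Sequential(
            torch.nn.Linear(hidden_channels, hidden_channels),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden_channels, out_channels),
        )

    def forward(self, x, edge_index, batch):
        # Node-level message passing.
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        # Graph-level readout: averages node embeddings per graph in the
        # mini-batch, replacing a plain torch.mean over all nodes.
        x = global_mean_pool(x, batch)  # [num_graphs, hidden_channels]
        # Final transformation on top of the pooled features.
        return self.mlp(x)  # [num_graphs, out_channels]
```

When the graphs come from a `torch_geometric.loader.DataLoader`, each mini-batch provides the `batch` vector that tells `global_mean_pool` which node belongs to which graph, so the output has one row per graph.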
Hi Matt,

I have a question about designing a simple pooling layer from some `GCNConv` layers. I would like to create a supervised regression model. I have many graphs in the dataset, and each graph has `x=[100,1]` and `y=[2,1]` as the label.

From my understanding, when I use `in_channels=100` and `out_channels=2`, `torch.mean(x, dim=1)` will act like a pooling layer that reduces the input dimension from `100` to `2`. Is that right? Could it be considered as `MeanPooling`?