A Survey on Deep Graph Generation: Methods and Applications [PMLR22]
Abstract
formulation of the deep graph generation problem, discussion of how it differs from several related graph learning tasks, three categories of methods, applications, and open challenges
Intro
model
GraphRNN: generates nodes and edges step by step
GraphVAE
MoFlow: invertible mapping between the input graph and the latent space
MolGAN: discriminator ensures the properties of the generated graphs
GDSS: diffusion-based generation
Problem
link prediction
graph structure learning
aims to refine noisy or incomplete graph structures
A Survey on Graph Structure Learning: Progress and Opportunities
Graph Structure Learning for Robust Graph Neural Networks
Generative sampling
generate subsets of nodes and edges from a large graph
Set generation
generate set objects
similar to graph generation, but typically does not consider edge features
Framework
Encoder
input: graph
output: latent vector
the encoder outputs the parameters of a posterior distribution over the latent variables, regularized toward a prior distribution
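A minimal sketch of how the encoder output feeds the sampler, assuming the common Gaussian-posterior setup (the function name and reparameterization form are illustrative, not from the survey):

```python
import math
import random

def encode_and_sample(mu, logvar, seed=0):
    """Sampler step for a VAE-style encoder: the encoder emits the mean and
    log-variance of a Gaussian posterior, and a latent vector is drawn via
    the reparameterization trick z = mu + sigma * eps."""
    rng = random.Random(seed)
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, logvar)]
```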
Sampler
Random sampling (or distribution learning)
Controllable sampling: aims to generate new graphs with desired properties
usually depends on different types of models and requires an additional optimization term
Decoder
compared to encoding, decoding (graph generation) is more complicated due to the discrete, non-Euclidean nature of graph objects
Sequential generation or One-shot generation
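A toy sketch of the one-shot alternative, where all edges are predicted from the latent vector in a single step (the sigmoid `score` is a hypothetical stand-in for a learned edge decoder, not the survey's method):

```python
import math
import random

def one_shot_decode(z, n_nodes=4, seed=0):
    """One-shot decoding sketch: score every candidate edge from the latent
    vector at once and sample the whole adjacency matrix in a single step,
    in contrast to sequential (node-by-node) generation."""
    rng = random.Random(seed)

    def score(i, j):
        # toy stand-in for a learned pairwise edge decoder
        return 1.0 / (1.0 + math.exp(-(z[i % len(z)] + z[j % len(z)])))

    adj = [[0] * n_nodes for _ in range(n_nodes)]
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            e = 1 if rng.random() < score(i, j) else 0
            adj[i][j] = adj[j][i] = e  # undirected: keep the matrix symmetric
    return adj
```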
Models
Auto-Regressive models
given the already-generated subgraph, predicts the action for the next generation step (e.g., adding a node or an edge)
requires a pre-specified node ordering $\pi$ of the graph
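The step-by-step idea can be sketched as follows; `edge_prob` is a placeholder for a learned conditional (e.g., an RNN over the generation history, as in GraphRNN), and the uniform 0.5 default is purely illustrative:

```python
import random

def autoregressive_generate(max_nodes=6, edge_prob=None, seed=0):
    """Auto-regressive generation sketch: nodes are added one at a time in a
    fixed ordering pi, and for each new node the model decides its edges to
    the already-generated subgraph."""
    rng = random.Random(seed)
    if edge_prob is None:
        edge_prob = lambda adj, i, j: 0.5  # placeholder learned conditional
    adj = []  # row i holds edges from node i to earlier nodes 0..i-1
    for i in range(max_nodes):
        adj.append([1 if rng.random() < edge_prob(adj, i, j) else 0
                    for j in range(i)])
    return adj
```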
VAEs
reconstruction loss + a constraint pulling the latent space toward the prior distribution (KL term)
GNNs (like GCN, and GAT) are typically used for the encoder/decoder
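The two-term objective can be written out concretely; this assumes the standard diagonal-Gaussian posterior with a standard-normal prior (the `beta` weight is a common generalization, not specific to the survey):

```python
import math

def gaussian_kl(mu, logvar):
    """Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over dims."""
    return 0.5 * sum(math.exp(lv) + m * m - 1.0 - lv
                     for m, lv in zip(mu, logvar))

def vae_loss(recon_nll, mu, logvar, beta=1.0):
    # reconstruction term + constraint matching the latent posterior to the prior
    return recon_nll + beta * gaussian_kl(mu, logvar)
```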
Normalizing Flows
estimates the density of the graph directly via an invertible, deterministic mapping $f$ between the latent variables and the graphs
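The change-of-variables idea behind exact density estimation, shown with a toy 1-D affine flow (real graph flows such as MoFlow use learned invertible layers; this scalar map is only for illustration):

```python
import math

def affine_flow_log_density(x, scale=2.0, shift=1.0):
    """Exact log-density under the invertible map x = f(z) = scale*z + shift
    with a standard-normal base distribution:
        log p(x) = log p_z(f^{-1}(x)) - log |df/dz|"""
    z = (x - shift) / scale                       # deterministic inverse mapping
    log_pz = -0.5 * (z * z + math.log(2.0 * math.pi))
    return log_pz - math.log(abs(scale))          # change-of-variables correction
```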
GANs
generator $f_G$, discriminator $f_D$
Permutation Invariance
graphs are inherently invariant with respect to node permutations
some works achieve permutation invariance, while auto-regressive models (e.g., GraphRNN) require a node ordering and are thus permutation-variant
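Permutation invariance can be checked with a toy example: a readout built from order-independent aggregation gives the same result under any node relabeling (the degree-multiset readout here is illustrative, not a specific model from the survey):

```python
def degree_readout(adj):
    """Permutation-invariant graph summary: the sorted multiset of node
    degrees. Relabeling the nodes changes the adjacency matrix but not
    this readout."""
    return sorted(sum(row) for row in adj)

# the same 3-node path graph under two different node orderings
path_a = [[0, 1, 0],
          [1, 0, 1],
          [0, 1, 0]]
path_b = [[0, 1, 1],
          [1, 0, 0],
          [1, 0, 0]]  # the center node relabeled as node 0
```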
Diffusion models
Sampling Strategies
random sampling
Controllable sampling
Disentangled sampling: factorizes the latent vector $z$ so that each dimension $z_n$ focuses on one property
Conditional sampling: concatenate $z$ with a condition code $c$ that explicitly controls the properties of the generated graphs
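The concatenation step can be sketched as below; `decoder` and `toy_decoder` are hypothetical stand-ins for a trained conditional decoder, used only to show how $c$ steers the output:

```python
import random

def conditional_sample(decoder, cond, latent_dim=4, seed=0):
    """Controllable (conditional) sampling sketch: draw z from the prior
    and concatenate the condition vector c before decoding, so c explicitly
    steers properties of the generated graph."""
    rng = random.Random(seed)
    z = [rng.gauss(0.0, 1.0) for _ in range(latent_dim)]
    return decoder(z + list(cond))  # decoder consumes [z ; c]

# toy decoder whose output size depends only on the condition entry
toy_decoder = lambda zc: {"n_nodes": int(round(abs(zc[-1]) * 10))}
```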