The BigGAN-Deep generator in `BigGANdeep.py` deviates from the official `biggan-deep-512` tfhub module in several important ways. After reverse-engineering the TensorFlow graph, the following fixes were identified:
- Spectrally normalize the conv layers: `which_conv=layers.SNConv2d` (was `which_conv=nn.Conv2d`)
- Swap the label concatenation: `z = torch.cat([z, y], 1)` (was `z = torch.cat([y, z], 1)`)
- The final layer outputs 128 channels rather than 3, and the first 3 channels are returned via slicing (see the sketch after this list)
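For context, here is a minimal sketch of how the three fixes fit together in the generator's output path. `sn_conv2d` is a stand-in for the repo's `layers.SNConv2d` built from PyTorch's `nn.utils.spectral_norm`, and the class structure is illustrative, not the actual `BigGANdeep.py` code:

```python
import torch
import torch.nn as nn

# Stand-in for the repo's layers.SNConv2d: a conv wrapped in spectral norm.
def sn_conv2d(*args, **kwargs):
    return nn.utils.spectral_norm(nn.Conv2d(*args, **kwargs))

class OutputSketch(nn.Module):
    """Toy stand-in for the generator's output path; not the actual BigGANdeep.py code."""
    def __init__(self, ch=128):
        super().__init__()
        # Fix 1: spectrally normalized convs (which_conv=layers.SNConv2d)
        self.which_conv = sn_conv2d
        # Fix 3 (part 1): the final conv emits 128 channels, not 3
        self.output_layer = self.which_conv(ch, 128, kernel_size=3, padding=1)

    def forward(self, z, y, h):
        # Fix 2: concatenate noise first, then the class embedding
        z = torch.cat([z, y], 1)
        # ... z would condition the residual blocks that produce h ...
        out = self.output_layer(h)
        # Fix 3 (part 2): only the first 3 (RGB) channels are returned
        return torch.tanh(out[:, :3])
```

With `h` of shape `(N, 128, 512, 512)`, the returned tensor has shape `(N, 3, 512, 512)`, matching the tfhub module's RGB output.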
Additionally, this PR implements the following non-crucial changes. These can be dropped from the PR if you'd prefer:
- Support 512x512 resolution for the BigGAN-Deep generator (see the sketch after this list)
- Some minor reshuffling of the upscaling operation to match the TensorFlow graph
- The Generator's default arguments now correspond to `biggan-deep-512`
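For the 512x512 support, a hedged sketch of what the added architecture entry might look like, following the pattern of the repo's arch dictionaries; the channel multipliers and attention resolution below are illustrative assumptions, not values taken from the PR:

```python
ch = 128  # base channel width

# Hypothetical 512x512 entry in the style of the repo's G_arch dicts.
# Multipliers and attention placement are assumptions for illustration only.
arch_512 = {
    'in_channels':  [ch * m for m in [16, 16, 8, 8, 4, 2, 1]],
    'out_channels': [ch * m for m in [16, 8, 8, 4, 2, 1, 1]],
    'upsample':     [True] * 7,
    'resolution':   [8, 16, 32, 64, 128, 256, 512],
    'attention':    {2 ** i: (2 ** i == 64) for i in range(3, 10)},  # attention at 64x64
}
```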