cc-ai / climategan

Code and pre-trained model for the algorithm generating visualisations of 3 climate change-related events: floods, wildfires and smog.
https://thisclimatedoesnotexist.com
GNU General Public License v3.0

About bit conditioning #47

Closed (melisandeteng closed this issue 4 years ago)

melisandeteng commented 4 years ago

So bit conditioning is "a way to share weights. Instead of having domain-specific weights, you share weights and 'choose' a path according to that domain signal, encoded in the 'bit'."

I'm not sure I understand why cond_nc is initialized twice in the SpadeTranslationDecoder, but more importantly, why it would be set to 2 when bit conditioning is used and to 0 otherwise?

class SpadeTranslationDecoder(SpadeDecoder):
    def __init__(self, latent_shape, opts):
        self.bit = None
        self.use_bit_conditioning = opts.gen.t.use_bit_conditioning
        cond_nc = 4  # 4 domains => 4-channel bitmap
        cond_nc = 2 if self.use_bit_conditioning else 0  # 2 domains => 2-channel bitmap
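
To ground the question, here is a minimal sketch of how a cond_nc-channel conditioning map is typically consumed in a SPADE-style decoder: the normalization layers predict their modulation parameters from the bit map, so a single set of weights behaves differently depending on which channel is "on". The SpadeNorm below is hypothetical and only illustrates the idea, not the repo's actual SpadeDecoder.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SpadeNorm(nn.Module):
    # Normalize x, then modulate it with gamma/beta predicted from the bit map.
    def __init__(self, nc, cond_nc):
        super().__init__()
        self.norm = nn.InstanceNorm2d(nc, affine=False)
        self.gamma = nn.Conv2d(cond_nc, nc, 3, padding=1)
        self.beta = nn.Conv2d(cond_nc, nc, 3, padding=1)

    def forward(self, x, bit):
        # Resize the bit map to the feature resolution before predicting modulations.
        bit = F.interpolate(bit, size=x.shape[-2:], mode="nearest")
        return self.norm(x) * (1 + self.gamma(bit)) + self.beta(bit)

# Same weights, different behaviour depending on which conditioning channel is set:
x = torch.randn(1, 8, 16, 16)
flood_bit = torch.zeros(1, 2, 16, 16)
flood_bit[:, 0] = 1.0
out = SpadeNorm(nc=8, cond_nc=2)(x, flood_bit)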
vict0rsch commented 4 years ago

it's initialized twice because I was unsure what the best way to do it was.

if we forget about real/sim, then this thing is only supposed to add / remove water, with the domain encoded in a pair of feature maps that look like this:

flood_features = torch.zeros(batch, 2, z_height, z_width)
flood_features[:, 0, :, :] = 1.  # channel 0 "on" => flood

non_flood_features = torch.zeros(batch, 2, z_height, z_width)
non_flood_features[:, 1, :, :] = 1.  # channel 1 "on" => non-flood

this is done in get_4D_bit, depending on the model's current bit, which is flipped when accessing SpadeTranslationDict(...)["f"] or SpadeTranslationDict(...)["n"]
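
For illustration, a minimal sketch of what such a bit-map helper could look like; make_bit_map is a hypothetical name and not necessarily the exact logic of get_4D_bit:

import torch

def make_bit_map(domain, batch, z_height, z_width, domains=("f", "n")):
    # One channel per domain; only the channel matching `domain` is set to 1.
    bit = torch.zeros(batch, len(domains), z_height, z_width)
    bit[:, domains.index(domain), :, :] = 1.0
    return bit

flood_features = make_bit_map("f", batch=4, z_height=8, z_width=8)
non_flood_features = make_bit_map("n", batch=4, z_height=8, z_width=8)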

vict0rsch commented 4 years ago

obviously we could have only 1 binary channel with 0 or 1, but I thought encoding it in a one-hot way would make more sense, especially if we want to use a single decoder for flood/non_flood and real/sim
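
To make that concrete, here is a small sketch comparing the two encodings; the 4-channel layout for flood/non_flood plus real/sim is just one possible convention, not necessarily what the repo will settle on:

import torch

batch, z_height, z_width = 4, 8, 8

# Single binary channel: 1 = flood, 0 = non-flood. Compact, but it leaves no
# natural room for a second domain axis without redefining the values.
single_bit = torch.ones(batch, 1, z_height, z_width)

# One-hot channels: one channel per domain value, so flood/non_flood (2 channels)
# and real/sim (2 channels) can simply be concatenated into a 4-channel bit map.
flood = torch.tensor([1.0, 0.0])  # [flood, non_flood]
sim = torch.tensor([1.0, 0.0])    # [sim, real] -- hypothetical ordering
bit_4d = torch.cat([flood, sim]).view(1, 4, 1, 1).expand(batch, 4, z_height, z_width)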

vict0rsch commented 4 years ago

(the leftover cond_nc = 4 is there to remind us of the currently uncertain choice of not sharing weights for adaptation)