Closed melisandeteng closed 4 years ago
It's initialized twice because I was unsure what the best way to do it was.
If we forget about real/sim, then this thing is only supposed to add/remove water, encoded in a pair of feature maps that look like this:

```python
flood_features = torch.zeros(batch, 2, z_height, z_width)
flood_features[:, 0, :, :] = 1.

non_flood_features = torch.zeros(batch, 2, z_height, z_width)
non_flood_features[:, 1, :, :] = 1.
```
This is done in `get_4D_bit`, depending on the model's current bit, which is flipped when accessing `SpadeTranslationDict(...)["f"]` or `SpadeTranslationDict(...)["n"]`.
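For concreteness, here is a minimal runnable sketch of what such a `get_4D_bit` helper could look like. The function name comes from this repo, but the signature and body below are my guess at the semantics described above, not the actual implementation:

```python
import torch

def get_4D_bit(shape, is_flood):
    """Build a one-hot (batch, 2, H, W) conditioning tensor.

    Channel 0 is all ones for the "flood" bit, channel 1 is all
    ones for "non-flood"; the other channel stays zero.
    (Sketch of the assumed semantics, not the repo's code.)
    """
    batch, _, z_height, z_width = shape
    bit = torch.zeros(batch, 2, z_height, z_width)
    bit[:, 0 if is_flood else 1, :, :] = 1.0
    return bit

# usage: build both bits for a batch of 4 latent maps of size 8x8
flood_bit = get_4D_bit((4, 16, 8, 8), is_flood=True)
non_flood_bit = get_4D_bit((4, 16, 8, 8), is_flood=False)
```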
Obviously we could have only 1 binary channel with 0 or 1, but I thought encoding it in a one-hot way would make more sense, especially if we want to use a single decoder for flood/non_flood and real/sim.
(The trailing 4 is to remind us of the current, uncertain choice of not sharing weights for adaptation.)
So bit conditioning is "a way to share weights: instead of having domain-specific weights, you share weights and 'choose' a path according to that domain signal, encoded in the 'bit'".
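As a toy illustration of that weight-sharing idea (module name and shapes are mine, not the repo's): a single conv layer serves both domains, and only the one-hot bit concatenated along the channel dimension tells it which "path" the input belongs to:

```python
import torch
import torch.nn as nn

class SharedDecoderBlock(nn.Module):
    """One set of weights for both domains; the 2 extra input
    channels carry the one-hot domain bit (illustrative sketch)."""

    def __init__(self, in_nc, out_nc, cond_nc=2):
        super().__init__()
        # input channels = features + conditioning bit channels
        self.conv = nn.Conv2d(in_nc + cond_nc, out_nc, 3, padding=1)

    def forward(self, z, bit):
        # concatenate the bit along channels, then convolve:
        # same weights, domain-dependent input
        return self.conv(torch.cat([z, bit], dim=1))

# usage: the same block handles "flood" and "non-flood" inputs
z = torch.randn(4, 16, 8, 8)
flood_bit = torch.zeros(4, 2, 8, 8)
flood_bit[:, 0] = 1.0  # channel 0 = flood
block = SharedDecoderBlock(16, 32)
out = block(z, flood_bit)
```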
I'm not sure I understand why `cond_nc` is initialized twice in the `SpadeTranslationDecoder`, and, more importantly, why it would be initialized to 2 when bit-conditioning and 0 otherwise.
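For context, my reading (an assumption, not verified against the code) is that `cond_nc` counts the extra conditioning channels concatenated to the decoder's input, so it simply matches the bit's channel count: the one-hot flood/non-flood bit above has 2 channels, and without bit conditioning nothing is concatenated:

```python
def get_cond_nc(use_bit_conditioning):
    # 2 extra channels for the one-hot flood/non-flood bit,
    # 0 when no bit is concatenated (assumed semantics,
    # not the repo's actual code)
    return 2 if use_bit_conditioning else 0
```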