rosinality / style-based-gan-pytorch

Implementation of A Style-Based Generator Architecture for Generative Adversarial Networks in PyTorch

What is the importance of the lines below? #11

Open nile649 opened 5 years ago

nile649 commented 5 years ago
        if mixing and random.random() < 0.9:
            gen_in11, gen_in12, gen_in21, gen_in22 = torch.randn(4, b_size, code_size, device='cuda').chunk(4, 0)
            gen_in1 = [gen_in11.squeeze(0), gen_in12.squeeze(0)]
            gen_in2 = [gen_in21.squeeze(0), gen_in22.squeeze(0)]

        else:
            gen_in1, gen_in2 = torch.randn(2, b_size, code_size, device='cuda').chunk(2, 0)
            gen_in1 = gen_in1.squeeze(0)
            gen_in2 = gen_in2.squeeze(0)
rosinality commented 5 years ago

It is the mixing regularization from the paper. Mixing regularization makes the model robust when more than one latent code is used for generation (style mixing).
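The sampling logic in the snippet above can be sketched in plain Python (nested lists stand in for torch tensors here, and `sample_latents` is a hypothetical helper name, not part of the repo): with probability 0.9 each generator call receives a pair of latent codes instead of one, which is what triggers style mixing downstream.

```python
import random

def sample_latents(b_size, code_size, mixing=True, p_mix=0.9):
    """Sketch of the latent sampling: one code per input, or a pair for mixing."""
    def randn():
        # stand-in for torch.randn(b_size, code_size)
        return [[random.gauss(0, 1) for _ in range(code_size)] for _ in range(b_size)]

    if mixing and random.random() < p_mix:
        # two latent codes per generator input -> mixing regularization path
        return [randn(), randn()], [randn(), randn()]
    # single latent code per generator input -> ordinary path
    return randn(), randn()

gen_in1, gen_in2 = sample_latents(b_size=4, code_size=8)
```

When the mixing branch is taken, `gen_in1` and `gen_in2` are each a list of two codes, matching the `[gen_in11.squeeze(0), gen_in12.squeeze(0)]` shape in the training code.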

nile649 commented 5 years ago

Thank you

nile649 commented 5 years ago

Could you also explain what inject_index and crossover do?

    if len(style) < 2:
        inject_index = [len(self.progression) + 1]

    else:
        inject_index = random.sample(list(range(step)), len(style) - 1)

    crossover = 0

    for i, (conv, to_rgb) in enumerate(zip(self.progression, self.to_rgb)):
        if mixing_range == (-1, -1):
            if crossover < len(inject_index) and i > inject_index[crossover]:
                crossover = min(crossover + 1, len(style))

            style_step = style[crossover]
rosinality commented 5 years ago

To do mixing regularization, you should choose some layers (indices) and use the secondary latent code after those layers. inject_index and crossover implement this.
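The crossover walk from the loop above can be sketched in plain Python (a hypothetical helper with the conv/to_rgb calls stripped out, and the clamp written as `n_styles - 1` as a defensive simplification): once the layer index `i` passes the current inject index, the loop advances to the next style code, so all layers up to the inject index use `style[0]` and later layers use the secondary codes.

```python
def styles_per_layer(n_layers, inject_index, n_styles):
    """Return which style code index each layer uses (sketch of the crossover logic)."""
    crossover = 0
    chosen = []
    for i in range(n_layers):
        # once layer i passes the current inject index, switch to the next style
        if crossover < len(inject_index) and i > inject_index[crossover]:
            crossover = min(crossover + 1, n_styles - 1)
        chosen.append(crossover)
    return chosen

# with one inject index at layer 2, layers 0-2 use style[0], layers 3+ use style[1]
print(styles_per_layer(6, [2], 2))  # -> [0, 0, 0, 1, 1, 1]
```

So `style_step = style[crossover]` in the real code picks the primary or secondary latent's style for the layer the loop is currently building.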

nile649 commented 5 years ago

style_step = style[crossover] gives the learned style from a latent code, rather than the layer where one needs to mix.

I am sorry, I am very much stuck on this line.

rosinality commented 5 years ago

Yes, it works by getting the style codes by layer index and injecting them into those layers.