eriklindernoren / PyTorch-GAN

PyTorch implementations of Generative Adversarial Networks.

About the embedding in CGAN #56

Open ShijianXu opened 5 years ago

ShijianXu commented 5 years ago

Hi, I have a question about the CGAN implementation. In your code, you use nn.Embedding to embed the class labels. The problem is that, when the weights are not specified, the embedding table is randomly initialized.

The generator and the discriminator each have their own nn.Embedding, and the two are initialized differently. So when we generate a fake image we use one embedding, but when the discriminator judges that fake image it uses the other embedding. Will this affect the final performance?
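
For reference, here is a minimal sketch of the pattern being discussed (the layer sizes are illustrative and not the exact values from the repo's cgan.py): each network constructs its own nn.Embedding, so the two label tables start from different random weights and are updated through different loss terms.

```python
import torch
import torch.nn as nn

n_classes, latent_dim, img_dim = 10, 100, 28 * 28  # illustrative sizes

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        # The generator owns its own randomly initialized label table.
        self.label_emb = nn.Embedding(n_classes, n_classes)
        self.model = nn.Sequential(
            nn.Linear(latent_dim + n_classes, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, img_dim),
            nn.Tanh(),
        )

    def forward(self, noise, labels):
        # Concatenate the label embedding with the noise vector.
        return self.model(torch.cat((self.label_emb(labels), noise), dim=-1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        # A second, independent embedding with its own random initialization.
        self.label_emb = nn.Embedding(n_classes, n_classes)
        self.model = nn.Sequential(
            nn.Linear(img_dim + n_classes, 256),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(256, 1),
        )

    def forward(self, img, labels):
        # Concatenate the flattened image with this network's label embedding.
        return self.model(torch.cat((img, self.label_emb(labels)), dim=-1))
```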

I am not very familiar with GANs, but this seems strange to me. It's true that both networks see the same labels, but the actual embedding vectors are different. Wouldn't it be more reasonable to use the same embedding for the discriminator and the generator? A sketch of that variant follows.
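
A hypothetical way to share the table (not what the repo does) would be to construct one nn.Embedding and pass the same module to both networks:

```python
import torch.nn as nn

n_classes = 10  # illustrative

# Build the label table once and hand the same module object to both networks,
# so generator and discriminator look up identical label vectors.
shared_label_emb = nn.Embedding(n_classes, n_classes)

class Generator(nn.Module):
    def __init__(self, label_emb):
        super().__init__()
        self.label_emb = label_emb
        # ... remaining generator layers ...

class Discriminator(nn.Module):
    def __init__(self, label_emb):
        super().__init__()
        self.label_emb = label_emb
        # ... remaining discriminator layers ...

G = Generator(shared_label_emb)
D = Discriminator(shared_label_emb)
```

One thing to decide with this variant is which optimizer owns the shared weights: since the same parameters are registered in both G and D, the generator and discriminator updates would otherwise both push on the same table.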

zhouqunbing commented 1 year ago

I'm also a beginner. If we set the random seed, will the nn.Embedding in the generator and the discriminator be the same?
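
A quick check (assuming the default nn.Embedding initialization, which draws from PyTorch's global RNG) shows the distinction: seeding once is not enough, because the second module consumes later numbers from the same RNG stream.

```python
import torch
import torch.nn as nn

n_classes = 10

# Seed once, then build both modules: the second Embedding draws different
# random numbers from the same stream, so the two tables differ.
torch.manual_seed(0)
emb_g = nn.Embedding(n_classes, n_classes)
emb_d = nn.Embedding(n_classes, n_classes)
print(torch.equal(emb_g.weight, emb_d.weight))  # False

# Re-seeding immediately before each construction makes the initial weights
# match, but they still diverge during training because the two tables
# receive different gradients.
torch.manual_seed(0)
emb_g = nn.Embedding(n_classes, n_classes)
torch.manual_seed(0)
emb_d = nn.Embedding(n_classes, n_classes)
print(torch.equal(emb_g.weight, emb_d.weight))  # True
```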