hwalsuklee / tensorflow-generative-model-collections

Collection of generative models in Tensorflow
Apache License 2.0

Continuous latent loss possibly wrong in InfoGAN #21

Closed · abdulfatir closed this issue 6 years ago

abdulfatir commented 6 years ago

In InfoGAN.py, line 146, the continuous-code squared loss is computed as follows:

cont_code_est = code_fake[:, self.len_discrete_code:]
cont_code_tg = self.y[:, self.len_discrete_code:]
q_cont_loss = tf.reduce_mean(tf.reduce_sum(tf.square(cont_code_tg - cont_code_est), axis=1))

The code_fake vector lies in [0, 1] because it comes from a softmax non-linearity. The continuous latent code actually fed to the generator, however, is sampled uniformly from [-1, 1] (see line 234):

batch_codes = np.concatenate((batch_labels, np.random.uniform(-1, 1, size=(self.batch_size, 2))),
                             axis=1)
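
A quick standalone check makes the mismatch concrete (a minimal NumPy sketch, not code from the repository):

import numpy as np

# The continuous codes fed to the generator are drawn uniformly from [-1, 1],
# so roughly half of the target entries are negative and can never be matched
# by a softmax output, which is confined to [0, 1].
codes = np.random.uniform(-1, 1, size=(64, 2))
print(codes.min(), codes.max())   # close to -1 and 1
print((codes < 0).mean())         # about half of the entries are negative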

Am I missing something?

hwalsuklee commented 6 years ago

Oh..

You are right. It must be code_logit_fake.

Thanks for your contribution.

I've changed the code.
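
For reference, the corrected loss would compare against the raw (pre-softmax) classifier output, which is unbounded and can match targets in [-1, 1]. A minimal sketch, assuming the classifier also returns its logits as code_logit_fake:

# Use the unbounded logits rather than the softmax output in [0, 1],
# since the continuous targets are sampled from [-1, 1].
cont_code_est = code_logit_fake[:, self.len_discrete_code:]
cont_code_tg = self.y[:, self.len_discrete_code:]
q_cont_loss = tf.reduce_mean(tf.reduce_sum(tf.square(cont_code_tg - cont_code_est), axis=1))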