AlexiaJM / RelativisticGAN

Code for replication of the paper "The relativistic discriminator: a key element missing from standard GAN"

Training curve, and hybrid models #12

Closed: unilight closed this issue 6 years ago

unilight commented 6 years ago

Hi, I have read your paper. It was a really interesting idea!

I've been trying to implement your paper in TensorFlow, and I wonder if my implementation is correct. I'm familiar with WGAN-GP, so I tried RSGAN-GP first. Looking at the training curve of the discriminator loss, I found it fluctuating around 0.5. Is this a normal phenomenon?

Also, I wonder whether the idea of RGAN extends to hybrid models, e.g. VAE-GAN, or can be combined with other MSE-like loss functions?
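For reference, the gradient-penalty part of an RSGAN-GP setup like the one described above can be sketched as follows. This is a toy NumPy illustration (not the repo's actual code): it assumes a WGAN-GP-style penalty computed on the critic C at points interpolated between real and fake samples, and uses a linear critic so the gradient is analytic rather than coming from autograd.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear critic C(x) = w . x; its gradient w.r.t. x is simply w.
# (A real implementation would use a network and autograd.)
w = rng.normal(size=4)

def critic(x):
    return x @ w

def gradient_penalty(x_real, x_fake, lam=10.0):
    # Interpolate between real and fake samples, WGAN-GP style.
    eps = rng.uniform(size=(x_real.shape[0], 1))
    x_hat = eps * x_real + (1.0 - eps) * x_fake
    # For the linear critic, grad_x C(x_hat) = w for every sample.
    grads = np.broadcast_to(w, x_hat.shape)
    norms = np.linalg.norm(grads, axis=1)
    # Penalize deviation of the gradient norm from 1.
    return lam * np.mean((norms - 1.0) ** 2)

x_real = rng.normal(size=(5, 4))
x_fake = rng.normal(size=(5, 4))
gp = gradient_penalty(x_real, x_fake)
```

The penalty term is simply added to the relativistic discriminator loss, as in WGAN-GP.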

AlexiaJM commented 6 years ago

I have not looked at the training curves so far because TensorBoard was disabled due to incompatibilities, but yes, this is what you would expect. It's a true zero-sum game between P and Q: both distributions share D(x) in [0, 1]. D(x_real) increases above .50, which implicitly decreases D(x_fake) below .50; then D(x_fake) increases above .50, which implicitly decreases D(x_real) below .50, and so on.
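The dynamic described above can be seen directly from the RSGAN losses in the paper, where D(x_r, x_f) = sigmoid(C(x_r) - C(x_f)) for a non-transformed critic C. A minimal NumPy sketch (toy critic outputs, not the repo's implementation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# RSGAN discriminator loss: wants D(x_r, x_f) = sigmoid(C(x_r) - C(x_f)) -> 1,
# i.e. real samples rated more realistic than fake ones.
def rsgan_d_loss(c_real, c_fake):
    return -np.mean(np.log(sigmoid(c_real - c_fake)))

# RSGAN generator loss: the mirror image, wants fakes rated above reals.
def rsgan_g_loss(c_real, c_fake):
    return -np.mean(np.log(sigmoid(c_fake - c_real)))

# At equilibrium (P = Q) the critic cannot separate the two distributions,
# so C(x_r) - C(x_f) ~ 0, D ~ 0.5, and the loss hovers near -log(0.5) = log 2.
c_equal = np.zeros(8)
print(rsgan_d_loss(c_equal, c_equal))  # -> log(2) ~ 0.693
```

So a discriminator curve oscillating around a constant, with D near 0.5, is the expected zero-sum behaviour rather than a failure mode.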

I wouldn't know, but I assume that it should; relativism can be applied in almost any setting.

pvitoria commented 4 years ago

Hi Alexia, what about the generator's curve? Should it be around 1?

Thanks in advance.

AlexiaJM commented 4 years ago

It should stay mostly flat around some small number.