
InfoGAN: Looking for a Colab implementing a GAN for MNIST, with both the saturating and non-saturating GAN loss #3


avital commented 6 years ago

(Just a GAN implementation, no InfoGAN).

I'd like to show the specific differences in training for the two losses described in the original GAN paper.

A good Colab would include an abundance of text cells explaining exactly what each part is doing. You can put math in text cells, followed by TensorFlow code implementing that math.
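
For concreteness, a minimal sketch of the two generator losses from the paper (the names are illustrative; `d_fake` is assumed to be the discriminator's sigmoid output on generated samples), written in the TF 1.x-style API used later in this thread:

import tensorflow as tf  # TF 1.x-style API, matching the snippet further down

def saturating_generator_loss(d_fake):
    # Original minimax objective: G minimizes E[log(1 - D(G(z)))].
    return tf.reduce_mean(tf.log(1.0 - d_fake))

def non_saturating_generator_loss(d_fake):
    # Heuristic from the paper: G maximizes E[log D(G(z))],
    # i.e. minimizes -E[log D(G(z))].
    return -tf.reduce_mean(tf.log(d_fake))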

maxisawesome commented 5 years ago

Are you still looking for someone to do this? There's an example of a DCGAN on MNIST here: https://colab.research.google.com/github/tensorflow/tensorflow/blob/master/tensorflow/contrib/eager/python/examples/generative_examples/dcgan.ipynb

I'd be willing to implement it with both the saturating and non-saturating loss if you'd like. I would likely take code from that example, though I'd add material to demonstrate the differences between the loss functions. I can also simplify it: use dense layers instead of the DCGAN convolutions, remove batch_norm, etc. (a sketch of what I mean is below).
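
Hypothetically, the simplified models could look something like this (tf.keras assumed, layer sizes arbitrary; a sketch, not code from the linked notebook):

import tensorflow as tf

# Fully-connected stand-ins for the DCGAN generator/discriminator,
# with batch norm removed, operating on flattened 28x28 MNIST images.
def make_generator(noise_dim=100):
    return tf.keras.Sequential([
        tf.keras.layers.Dense(256, activation='relu', input_shape=(noise_dim,)),
        tf.keras.layers.Dense(28 * 28, activation='tanh'),
    ])

def make_discriminator():
    return tf.keras.Sequential([
        tf.keras.layers.Dense(256, activation='relu', input_shape=(28 * 28,)),
        tf.keras.layers.Dense(1, activation='sigmoid'),  # P(input is real)
    ])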

cinjon commented 5 years ago

That sounds great! Make it so and issue a PR when you are done 👍

MicPie commented 5 years ago

Thanks for setting up the Colab notebook!

However, when running the Colab notebook I stumbled over a few points. Unfortunately, I am not so familiar with Colab and TensorFlow, so maybe I am doing something wrong?

What brought me here in the first place was that I was looking into the loss functions and visualizing them and their derivatives. Based on that, I guess the implementation of the saturating loss in the Colab notebook should be:

def s_generator_loss(generated_output):
    # Saturating loss: G minimizes E[log(1 - D(G(z)))]
    return tf.reduce_mean(tf.log(1 - generated_output))

I.e., put the "1 -" into the log function.
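
To see why the placement matters, write $D(G(z)) = \sigma(a)$ for the discriminator's sigmoid over logit $a$. Then

$$\frac{\partial}{\partial a}\log\bigl(1-\sigma(a)\bigr) = -\sigma(a), \qquad \frac{\partial}{\partial a}\log\sigma(a) = 1-\sigma(a).$$

When the discriminator confidently rejects fakes ($\sigma(a) \approx 0$), the saturating loss passes back almost no gradient, while the non-saturating loss still yields a gradient near 1; that is exactly the difference the notebook should demonstrate.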

I will try to implement a similar basic GAN example in PyTorch and get back to this thread when I have carried out further tests.

Kind regards Michael

MicPie commented 5 years ago

Hello,

I now have a first draft of the notebook on GitHub: https://nbviewer.jupyter.org/github/MicPie/DepthFirstLearning/blob/master/InfoGAN/DCGAN_MNIST_v2.ipynb It is heavily based on the PyTorch tutorial notebook and has some nice visualizations. The plotted gradient standard deviations seem to look OK, so it should work.
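
In case it helps anyone reproduce that diagnostic, here is one minimal way to measure the gradient standard deviation in PyTorch after calling loss.backward(); this helper is illustrative, not the exact code from the notebook:

import torch

def grad_std(model):
    # Standard deviation over all parameter-gradient entries -- a cheap
    # proxy for how much training signal the current loss provides.
    grads = [p.grad.flatten() for p in model.parameters() if p.grad is not None]
    return torch.cat(grads).std().item()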

I will now polish it up and then contribute my notes back.

Kind regards Michael

avital commented 5 years ago

@MicPie Wow, this looks really good! Looking forward to putting your notebook into our content, once you're comfortable with the level of polish.

BTW once you're done, could you copy it over to Colab? That makes it easier for others to try it out and fork it to run their own experiments.

MicPie commented 5 years ago

Hey @avital, I polished the notebook and uploaded it to GitHub: https://nbviewer.jupyter.org/github/MicPie/DepthFirstLearning/blob/master/InfoGAN/DCGAN_MNIST_v5.ipynb

I also found an easy way to "colabify" GitHub notebooks just with a link: https://colab.research.google.com/github/MicPie/DepthFirstLearning/blob/master/InfoGAN/DCGAN_MNIST_v5.ipynb
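
(The pattern is simply https://colab.research.google.com/github/&lt;user&gt;/&lt;repo&gt;/blob/&lt;branch&gt;/&lt;path-to-notebook&gt;.ipynb, i.e. the same path GitHub uses, prefixed with the Colab domain.)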

The explanation in the notebook is based on the issue I opened.

If you have suggestions etc. just let me know! :-)