taimir / infogan-keras

Implementation of InfoGAN in Keras
MIT License

Assertion error #12

Open · jbmaxwell opened this issue 7 years ago

jbmaxwell commented 7 years ago

I'm trying to run your mnist example, and I'm getting:

```
  File "/home/jbmaxwell/anaconda3/envs/tensorflow/lib/python3.6/site-packages/keras/callbacks.py", line 764, in on_epoch_end
    assert len(val_data) == len(tensors)
AssertionError
```

As a note, I do get warnings about memory (I'm on a 6GB GTX 1060), but the log does indicate that "this is not a failure".
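For context (paraphrasing Keras 2.0's `callbacks.py` from memory, so the exact code may differ by version): the failing check sits in the `TensorBoard` callback, which pairs the validation arrays one-to-one with the model's input, target, and sample-weight tensors before feeding the histogram summaries. Roughly:

```python
# Paraphrased from Keras 2.0's TensorBoard.on_epoch_end, not code from this
# repo; exact details are version-dependent.
val_data = self.validation_data
tensors = (self.model.inputs +
           self.model.targets +
           self.model.sample_weights)
if self.model.uses_learning_phase:
    tensors += [K.learning_phase()]
assert len(val_data) == len(tensors)  # fails when the two counts disagree
```

So the error usually means the validation data handed to the callback has the wrong number of arrays for the model's inputs and outputs.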

taimir commented 7 years ago

I'll check it out some time today, thanks for the feedback :).

taimir commented 7 years ago

It was an observer issue; the code is now updated. Feel free to let me know if you run into other issues.

jbmaxwell commented 7 years ago

Great, it appears to be training now. I'm wondering, though, is there a setting for the number of iterations, or is there some other stopping condition? I'm on roughly the 4500th iteration...

taimir commented 7 years ago

Since the implementation in this repo is currently a plain GAN rather than a WGAN (disregarding the InfoGAN specifics for a second), the discriminator and generator losses are not necessarily interpretable during training. That makes it hard to build an automatic stopping condition on top of them; for now it's up to you to judge when to stop based on the outputs in TensorBoard (i.e. the Images tab for MNIST).
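For what it's worth, here is a minimal, self-contained sketch (not code from this repo) of how one could dump a grid of generated samples every so often and judge progress by eye; it assumes a Keras `generator` model that maps a latent vector of size `latent_dim` to 28x28x1 MNIST images:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # write image files; no display needed
import matplotlib.pyplot as plt

def save_sample_grid(generator, latent_dim, iteration, rows=5, cols=5):
    """Save a rows x cols grid of generated digits for visual inspection."""
    z = np.random.normal(size=(rows * cols, latent_dim))
    images = generator.predict(z)  # expected shape: (N, 28, 28, 1)
    fig, axes = plt.subplots(rows, cols, figsize=(cols, rows))
    for ax, img in zip(axes.flat, images):
        ax.imshow(img.squeeze(), cmap="gray")
        ax.axis("off")
    fig.savefig("samples_iter_%06d.png" % iteration)
    plt.close(fig)
```

Calling e.g. `save_sample_grid(generator, 100, it)` every few hundred iterations in the training loop gives you a file series you can flip through to decide when to stop.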

I have a WGAN implementation based on this repository lying around, and I plan to upstream it in the near future. In a WGAN the behaviour of the discriminator is generally much more stable, so one could stop training once the D loss saturates at a maximal value (as that would represent a saddle point in the optimization of the Wasserstein distance).
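As a rough illustration of that criterion (a hedged sketch, not code from this repo; the `window` and `tol` values are made up): one could track the D loss over a sliding window and stop once it no longer moves by more than a tolerance:

```python
from collections import deque

class SaturationStopper:
    """Signal a stop once the D loss has flattened out over a window."""

    def __init__(self, window=500, tol=1e-3):
        self.losses = deque(maxlen=window)
        self.tol = tol

    def should_stop(self, d_loss):
        self.losses.append(d_loss)
        if len(self.losses) < self.losses.maxlen:
            return False  # not enough history yet
        # saturated: the loss barely changed over the whole window
        return max(self.losses) - min(self.losses) < self.tol
```

In the training loop one would then do `if stopper.should_stop(d_loss): break` after each discriminator update.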

jbmaxwell commented 7 years ago

Right, okay. That makes sense. Thanks.

jbmaxwell commented 7 years ago

I noticed in the code that there's a flag for `recurrent_dim`... Can you shed any light on how I might use it? Presenting my data as a sequence could work well, so I'm curious.