cmanzo closed this 1 year ago
It looks good to me!
I have a few suggestions:
It would be possible to include the optimizers as input parameters, either by adding them in `__init__` or by overriding the `compile` method. This would give the user the ability to adjust them as needed, similar to what was done for the GAN.
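A minimal sketch of what that could look like. This is framework-agnostic: the class name `AdversarialAutoencoder`, the parameter names, and the `"adam"` default strings are illustrative stand-ins, not the actual implementation; in the real model the parameters would be Keras optimizer instances and `compile` would also call `super().compile()`.

```python
class AdversarialAutoencoder:
    """Illustrative stand-in for the model under review."""

    def __init__(self, autoencoder_optimizer=None, discriminator_optimizer=None):
        # Fall back to the current defaults when the user passes nothing.
        self.autoencoder_optimizer = autoencoder_optimizer or "adam"
        self.discriminator_optimizer = discriminator_optimizer or "adam"

    def compile(self, autoencoder_optimizer=None, discriminator_optimizer=None):
        # Overriding compile gives a second place to swap optimizers in,
        # mirroring how the GAN implementation handles it.
        if autoencoder_optimizer is not None:
            self.autoencoder_optimizer = autoencoder_optimizer
        if discriminator_optimizer is not None:
            self.discriminator_optimizer = discriminator_optimizer
```

Either entry point works; exposing both keeps the constructor convenient while letting `compile` match the usual Keras workflow.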
Have you considered adjusting the code formatting so it is consistent in style with the rest of the repo? One possible solution is the `ms-python.black-formatter` extension for Visual Studio Code.
It is important to include comprehensive documentation regarding the various input parameters of the model.
In the experiment the authors ran on MNIST, they used different learning rates for the autoencoder and the discriminator: 1e-3 and 5e-4, respectively. Have you tried training with these values? Do you think it is worth keeping the values suggested by the authors as defaults? The current implementation uses 1e-3 for both networks.
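One way to make those defaults explicit and user-overridable could look like this. The constant names and the helper are hypothetical; in the real model the returned rates would feed the two optimizers.

```python
# Learning rates from the authors' MNIST experiment (constant names are
# hypothetical, chosen here just for illustration).
AUTOENCODER_LR = 1e-3
DISCRIMINATOR_LR = 5e-4

def learning_rate_config(autoencoder_lr=AUTOENCODER_LR,
                         discriminator_lr=DISCRIMINATOR_LR):
    """Collect the per-network learning rates, overridable by the user."""
    return {"autoencoder": autoencoder_lr, "discriminator": discriminator_lr}
```

Keeping the paper's values as defaults documents the reference setup while still letting users pass their own rates.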
It would be very convenient to include a unit test for this model!
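Even a basic shape check would catch most regressions. A sketch, where `DummyAutoencoder` is an illustrative stand-in (the real test would build the actual model and check its `predict` output):

```python
import unittest

class DummyAutoencoder:
    """Stand-in for the real model; predict() just echoes its input."""
    def predict(self, batch):
        return batch

class TestAdversarialAutoencoder(unittest.TestCase):
    def test_output_shape_matches_input(self):
        model = DummyAutoencoder()
        # A batch of 4 flattened 28x28 "images".
        batch = [[0.0] * (28 * 28) for _ in range(4)]
        out = model.predict(batch)
        # The reconstruction should have the same batch size and feature size.
        self.assertEqual(len(out), len(batch))
        self.assertEqual(len(out[0]), len(batch[0]))
```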
I have included your suggestions; let me know how it looks!
Great! I think we are ready to merge!
I fixed a little typo in `test_models.py`; let me know if it runs now.
I've added the Wasserstein-GAN and modified the VAE so that the input size can be changed easily.