DeepTrackAI / DeepTrack2

MIT License

added WAE (both MMD and GAN) #185

Closed cmanzo closed 1 year ago

cmanzo commented 1 year ago

I've added the Wasserstein-GAN and modified the VAE so that the input size can be changed easily.
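For context, the MMD variant of the WAE penalizes the discrepancy between the encoded codes and the prior with a kernel MMD term. A minimal numpy sketch (the kernel choice and bandwidth here are illustrative, not necessarily what this PR implements):

```python
import numpy as np


def rbf_kernel(a, b, sigma=1.0):
    # Pairwise RBF kernel between the rows of a and b.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))


def mmd_penalty(z, z_prior, sigma=1.0):
    # Biased MMD^2 estimate between encoded codes z and prior samples z_prior;
    # zero when the two sets coincide, large when they are far apart.
    k_zz = rbf_kernel(z, z, sigma)
    k_pp = rbf_kernel(z_prior, z_prior, sigma)
    k_zp = rbf_kernel(z, z_prior, sigma)
    return k_zz.mean() + k_pp.mean() - 2 * k_zp.mean()
```

In the WAE-GAN variant this penalty is replaced by a discriminator trained to tell codes from prior samples, which is where the second optimizer discussed below comes in.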

JesusPinedaC commented 1 year ago

It looks good to me!

I have a few suggestions:

  1. It would be possible to include the optimizers as input parameters, either by adding them in `__init__` or by overriding the `compile` method. This would let the user adjust them as needed, similar to what was done for the GAN.

  2. Have you considered adjusting the code formatting for stylistic consistency? One option is the ms-python.black-formatter extension for Visual Studio Code.

  3. It is important to include comprehensive documentation for the model's input parameters.

  4. In their MNIST experiment, the authors used different learning rates for the autoencoder and the discriminator: 1e-3 and 5e-4, respectively. Have you tried training with these values? Do you think it is worth keeping the authors' suggested values as defaults? The current implementation uses 1e-3 for both networks.

  5. It would be very convenient to include a unit test for this model!
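Suggestions 1 and 4 combine naturally: expose both optimizers as constructor parameters, defaulting to the learning rates from the authors' MNIST experiment. A hedged sketch of the pattern (the `WAE` class name and parameter names are illustrative, not DeepTrack2's actual API; `Optimizer` stands in for e.g. a Keras `Adam` instance):

```python
from dataclasses import dataclass


@dataclass
class Optimizer:
    # Stand-in for a real optimizer such as tf.keras.optimizers.Adam.
    learning_rate: float


class WAE:
    def __init__(self, autoencoder_optimizer=None, discriminator_optimizer=None):
        # Fall back to the learning rates used in the authors' MNIST
        # experiment: 1e-3 for the autoencoder, 5e-4 for the discriminator.
        self.autoencoder_optimizer = autoencoder_optimizer or Optimizer(1e-3)
        self.discriminator_optimizer = discriminator_optimizer or Optimizer(5e-4)
```

A user can then pass `WAE(discriminator_optimizer=Optimizer(1e-4))` to override one optimizer while keeping the other default, mirroring how the GAN handles its optimizers.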

cmanzo commented 1 year ago

I have included your suggestions; let me know how it looks!

JesusPinedaC commented 1 year ago

Great! I think we are ready to merge!

cmanzo commented 1 year ago

I fixed a little typo in test_models.py; let me know if it runs now.