The original ADAGE used tied weights (and was written in Theano). The previous ADAGE version in this repo did not have a tied-weights constraint, meaning the encoder and decoder weights were free to vary independently. We observed relatively poor performance of ADAGE models compared to other compression techniques (see simulation_results.md and z_dimensions_hyperparameter_sweep_results.md). We hypothesized that one reason for this could be the missing tied-weights constraint.
There used to be an option to tie weights in the Keras autoencoder class (keras-team/keras#180, keras-team/keras#3261) but this has since been removed.
This pull request adds that functionality as a custom layer and modifies the `DataModel` class to expose it as an option.
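As a minimal sketch of what a tied-weights custom layer can look like in current Keras (the `DenseTied` name and implementation details here are illustrative, not necessarily the exact code in this PR): the decoder layer borrows the encoder's kernel transposed and only trains its own bias.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras


class DenseTied(keras.layers.Layer):
    """Decoder layer whose kernel is the transpose of a tied encoder Dense layer's kernel."""

    def __init__(self, tied_to, activation=None, **kwargs):
        super().__init__(**kwargs)
        self.tied_to = tied_to  # the encoder Dense layer to share weights with
        self.activation = keras.activations.get(activation)

    def build(self, input_shape):
        # Only the bias is a new trainable variable; the kernel is shared.
        output_dim = self.tied_to.kernel.shape[0]
        self.bias = self.add_weight(
            name="bias", shape=(output_dim,), initializer="zeros"
        )

    def call(self, inputs):
        # W_decoder = W_encoder^T, realized via transpose_b in the matmul.
        outputs = tf.matmul(inputs, self.tied_to.kernel, transpose_b=True)
        return self.activation(outputs + self.bias)


# Tiny tied autoencoder: 10 input features compressed to a 3-dimensional latent space.
encoder_layer = keras.layers.Dense(3, activation="relu")
inputs = keras.Input(shape=(10,))
latent = encoder_layer(inputs)
reconstruction = DenseTied(encoder_layer, activation="sigmoid")(latent)
autoencoder = keras.Model(inputs, reconstruction)
autoencoder.compile(optimizer="adam", loss="mse")
```

With tying, the decoder contributes only a bias vector to the trainable parameters, roughly halving the weight count relative to an untied autoencoder of the same shape.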