Open · maxkohlbrenner opened this issue 7 years ago
In a simple setting, all ReLU units die very quickly.
AUTOENCODER SPECIFICATIONS:

```python
filter_dims         = [(5,5)]
hidden_channels     = [5]
use_max_pooling     = False
strides             = None  # other strides should not work yet
activation_function = 'relu'
batch_size          = 100
max_iterations      = 50
chk_iterations      = 10
step_size           = 0.0001
```
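One way to confirm the diagnosis is to count how many units are actually "dead", i.e. output zero for every sample in a batch (so no gradient flows through them). A minimal sketch below, in PyTorch rather than the repo's own code; the layer shapes only mirror the specs above (`filter_dims=[(5,5)]`, `hidden_channels=[5]`, `batch_size=100`) and the 28x28 input is a placeholder:

```python
# Sketch: measure the fraction of dead ReLU units on a batch.
# Not the repo's code; shapes are assumptions mirroring the config above.
import torch
import torch.nn as nn

def dead_relu_fraction(pre_activations: torch.Tensor) -> float:
    """Fraction of units whose pre-activation is <= 0 for every sample
    in the batch; those units receive zero gradient through the ReLU."""
    dead = (pre_activations <= 0).all(dim=0)  # per-unit, across the batch
    return dead.float().mean().item()

conv = nn.Conv2d(1, 5, kernel_size=5)   # hidden_channels = [5], filter_dims = [(5,5)]
x = torch.randn(100, 1, 28, 28)          # batch_size = 100, placeholder input size
z = conv(x)                              # pre-activation values
print(f"dead units: {dead_relu_fraction(z.flatten(1)):.1%}")
```

Logging this statistic every `chk_iterations` steps would show whether the units die immediately or drift to zero during training.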
A deeper autoencoder only learns something useful with a sigmoid activation. For the weight transfer to the CNN, for instance, a ReLU activation would be more useful. Find out why it doesn't work.
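The usual suspects for dying ReLUs are initialization that is not matched to the activation and a step size that pushes pre-activations permanently negative. A hedged sketch of common mitigations (again PyTorch, not a verified fix for this repo): a leaky variant plus He initialization, possibly combined with a smaller `step_size`:

```python
# Sketch of common dying-ReLU mitigations; layer shapes mirror the config
# above and are assumptions, not the repo's actual architecture.
import torch.nn as nn

encoder = nn.Sequential(
    nn.Conv2d(1, 5, kernel_size=5),
    nn.LeakyReLU(negative_slope=0.01),  # keeps a small gradient for x < 0
)

# He (Kaiming) init is scaled for ReLU-family activations; the framework
# default plus a large step size makes units more likely to get stuck at 0.
for m in encoder.modules():
    if isinstance(m, nn.Conv2d):
        nn.init.kaiming_normal_(m.weight, nonlinearity='leaky_relu')
        nn.init.zeros_(m.bias)
```

If the leaky variant trains where plain ReLU does not, that points at dead units rather than a bug elsewhere, and the learned weights would still transfer to a ReLU CNN more naturally than sigmoid ones.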