Closed kjancsi closed 9 years ago
Yes, you can specify a couple of parameters when creating the layer that will effectively initialize it this way:
```python
exp = theanets.Experiment(
    theanets.Autoencoder,
    layers=(10, dict(form='rnn', sparsity=0.999, radius=1, size=10), 10),
)
```
See the recurrent examples for other ways of specifying initial layer configurations.
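For reference, the trick from the paper boils down to starting the recurrent weight matrix at (a scaled) identity with zero biases, so a ReLU state is carried forward unchanged at step one. A minimal NumPy sketch of that initialization, independent of theanets' own initializers (the function name and the small input-weight scale are illustrative choices, not theanets API):

```python
import numpy as np

def irnn_init(n, scale=1.0, rng=None):
    """Identity initialization for a ReLU RNN (arXiv:1504.00941):
    recurrent weights start as a scaled identity, biases at zero."""
    rng = rng or np.random.default_rng(0)
    W_hh = scale * np.eye(n)                # recurrent weights = identity
    W_xh = rng.normal(0.0, 0.001, (n, n))   # small random input-to-hidden weights
    b_h = np.zeros(n)                       # zero hidden bias
    return W_xh, W_hh, b_h

# One recurrent step, h_t = relu(h @ W_hh + b): with identity weights
# and zero input, the hidden state passes through unchanged.
W_xh, W_hh, b_h = irnn_init(4)
h = np.maximum(0, np.ones(4) @ W_hh + b_h)
```

The theanets parameters above approximate the same idea differently: `sparsity` zeroes most off-diagonal recurrent weights and `radius` rescales the matrix's spectral radius to 1, yielding a near-identity starting point rather than an exact one.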
This seems like an interesting, simple trick for initialising RNNs with ReLU units: http://arxiv.org/abs/1504.00941. Is there a simple way of doing this in theanets? Thanks.