lmjohns3 / theanets

Neural network toolkit for Python
http://theanets.rtfd.org
MIT License

Recurrent network initialisation with ReLU #75

Closed kjancsi closed 9 years ago

kjancsi commented 9 years ago

This seems like an interesting, simple trick for initialising RNNs with ReLU units: http://arxiv.org/abs/1504.00941. Is there a simple way of doing this in theanets? Thanks.

lmjohns3 commented 9 years ago

Yes, you can specify a couple of parameters when you create your layer that will effectively initialize it this way:

import theanets

exp = theanets.Experiment(
    theanets.Autoencoder,
    # sparsity=0.999 zeroes out nearly all recurrent weights at initialization;
    # radius=1 rescales the recurrent weight matrix to a spectral radius of 1.
    layers=(10, dict(form='rnn', sparsity=0.999, radius=1, size=10), 10),
)

See the recurrent examples for other ways of specifying initial layer configurations.
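
For reference, here is a minimal sketch of how the same initialization might be paired with ReLU units on the recurrent layer, as in the paper's IRNN setup, using a recurrent regressor rather than an autoencoder. The activation keyword and the recurrent.Regressor class used here are assumptions about the theanets API rather than something confirmed in this thread:

import theanets

# Hypothetical sketch (not from the thread): combine the sparse, unit-radius
# initialization shown above with ReLU activations on the recurrent layer.
exp = theanets.Experiment(
    theanets.recurrent.Regressor,
    layers=(
        10,
        dict(form='rnn', activation='relu', sparsity=0.999, radius=1, size=10),
        10,
    ),
)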