ixxi-dante / an2vec

Bringing node2vec and word2vec together for cool stuff
GNU General Public License v3.0

Read "Efficient Backprop" to check for mistakes in the implementation #7

Closed wehlutyk closed 6 years ago

wehlutyk commented 6 years ago

LeCun et al., 1998, "Efficient backprop" gives many tips and tricks for training neural networks well.

For instance, some people on Stack Overflow say that ReLU is not a good activation for auto-encoders, as it loses more information than, say, tanh. Check whether LeCun et al. have opinions about this and related points.
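Not from the issue itself, just a minimal numpy sketch of the intuition behind that Stack Overflow claim: ReLU maps every negative pre-activation to 0, so distinct negative inputs become indistinguishable to the decoder, whereas tanh is strictly monotonic and keeps distinct inputs distinct (at the cost of saturation).

```python
import numpy as np

def relu(x):
    # Zeroes out all negative inputs: non-invertible on half the domain
    return np.maximum(x, 0.0)

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])

r = relu(x)     # both negative inputs collapse to the same output (0)
t = np.tanh(x)  # five distinct inputs stay five distinct outputs
```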

wehlutyk commented 6 years ago

Moving on with this, as it will clarify what needs to be done for sparse feature normalisation and centring (#24).
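The wrinkle with centring sparse features is that subtracting the column means densifies the matrix. A common workaround, sketched here with scipy (this is an illustration, not the repo's actual implementation): scale by the inverse standard deviation, which preserves sparsity, and carry the shift term separately so centring is applied lazily, only where dense values are needed.

```python
import numpy as np
import scipy.sparse as sp

def sparse_standardize(X):
    """Scale sparse columns to unit variance without densifying;
    return the column shift separately so centring can be deferred."""
    mean = np.asarray(X.mean(axis=0)).ravel()
    # Var[x] = E[x^2] - E[x]^2, computed while X stays sparse
    sq_mean = np.asarray(X.multiply(X).mean(axis=0)).ravel()
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 1e-12))
    X_scaled = X @ sp.diags(1.0 / std)  # still sparse
    return X_scaled, mean / std

X = sp.random(100, 5, density=0.1, random_state=0, format="csr")
Xs, shift = sparse_standardize(X)
dense = Xs.toarray() - shift  # fully standardized; densified only at use time
```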

wehlutyk commented 6 years ago

Notes:

wehlutyk commented 6 years ago

For the activation function, another solution is to use the SELU activation (scaled exponential linear unit), which is self-normalising and should take care of normalisation on its own.
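A quick numpy sketch of why SELU can replace explicit normalisation (not code from this repo): with the constants from Klambauer et al. (2017), zero mean and unit variance is a fixed point of the activation, so standard-normal inputs come out with approximately zero mean and unit variance — and with lecun_normal weight initialisation this property propagates through the layers.

```python
import numpy as np

# SELU constants from Klambauer et al. (2017), "Self-Normalizing
# Neural Networks"
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    return SCALE * np.where(x > 0, x, ALPHA * np.expm1(x))

# Standard-normal inputs map to ~zero mean / ~unit variance outputs
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = selu(x)
```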

wehlutyk commented 6 years ago

Done reading "Efficient backprop". So the changes above need to be implemented now (it's all pretty simple). Next:

wehlutyk commented 6 years ago

Closing this as it's read, and referenced in https://github.com/ixxi-dante/nw2vec/projects/1#card-13000883.