Neuroglycerin / neukrill-net-work

NDSB competition repository for scripting, note taking and writing submissions.
MIT License

Monitoring norms of gradient updates #65

Closed matt-graham closed 9 years ago

matt-graham commented 9 years ago

In this post it is mentioned that one heuristic for setting the learning rate is to look at the ratio of the weight-update norms to the weight norms, and to adjust the learning rate so that this ratio is roughly 1/1000. Weight norms are already monitored by default by pylearn2, but we currently have no way of monitoring the norms of the gradient updates. It would be useful to add monitoring channels for this purpose.
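To make the heuristic concrete, here is a minimal NumPy sketch (not part of the repository; function name and values are illustrative) that computes the update-norm / weight-norm ratio for a plain SGD step, so you can see how the learning rate moves the ratio towards the suggested ~1e-3:

```python
import numpy as np


def update_to_weight_ratio(weights, gradient, learning_rate):
    """Ratio of the SGD update norm to the weight norm.

    The heuristic above suggests tuning the learning rate so that
    this ratio comes out at roughly 1e-3.
    """
    update = -learning_rate * gradient
    return np.linalg.norm(update) / np.linalg.norm(weights)


# Toy weight matrix and gradient (hypothetical values).
rng = np.random.RandomState(0)
W = rng.randn(256, 128)
grad = 0.1 * rng.randn(256, 128)

for lr in (1e-1, 1e-2, 1e-3):
    ratio = update_to_weight_ratio(W, grad, lr)
    print("lr=%.0e  update/weight norm ratio=%.2e" % (lr, ratio))
```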

matt-graham commented 9 years ago

Now possible using the UpdateNormMonitorLearningRule class in neukrill-net-tools/neukrill_net/update_norm_monitor.py.
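For reference, a framework-agnostic sketch of the idea behind such a monitoring rule (the class below is hypothetical and does not reproduce the actual UpdateNormMonitorLearningRule API): it wraps a plain SGD step and records the update-norm / weight-norm ratio for each parameter on every call.

```python
import numpy as np


class NormMonitoringSGD(object):
    """Hypothetical wrapper: plain SGD that logs update/weight norm ratios."""

    def __init__(self, learning_rate):
        self.learning_rate = learning_rate
        self.ratios = []  # one entry per parameter per step

    def step(self, params, grads):
        """Apply one in-place SGD update and record norm ratios."""
        for p, g in zip(params, grads):
            update = -self.learning_rate * g
            self.ratios.append(np.linalg.norm(update) / np.linalg.norm(p))
            p += update
        return params


# Usage with toy parameters and gradients (hypothetical values).
rng = np.random.RandomState(0)
params = [rng.randn(64, 32), rng.randn(32)]
grads = [0.1 * rng.randn(64, 32), 0.1 * rng.randn(32)]
sgd = NormMonitoringSGD(learning_rate=1e-2)
sgd.step(params, grads)
print(["%.2e" % r for r in sgd.ratios])
```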