neuroailab / tfutils

Utilities for working with tensorflow
MIT License

restoring optimizer state #72

Closed anayebi closed 6 years ago

anayebi commented 7 years ago

By default, it seems that the optimizer state is not saved in the checkpoint. So if you train with momentum and later want to resume training from that saved checkpoint, the optimizer state (such as the momentum accumulators) is not reloaded, and performance degrades relative to training continuously. A sketch of the underlying cause follows below.
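Here is a minimal TF 1.x sketch of why this happens (variable names are illustrative, not tfutils code): the momentum slot variables live in the `GLOBAL_VARIABLES` collection but not in `TRAINABLE_VARIABLES`, so a saver built from the trainable variables alone silently drops them:

```python
import tensorflow as tf  # TF 1.x API, matching the era of this issue

w = tf.Variable(1.0, name="w")
loss = tf.square(w)
opt = tf.train.MomentumOptimizer(learning_rate=0.1, momentum=0.9)
train_op = opt.minimize(loss)  # creates the "w/Momentum" slot variable

# A saver built from trainable variables alone omits the slot:
partial_saver = tf.train.Saver(var_list=tf.trainable_variables())

# A saver over all global variables includes it:
full_saver = tf.train.Saver(var_list=tf.global_variables())

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train_op)
    print([v.name for v in tf.trainable_variables()])  # ['w:0']
    print([v.name for v in tf.global_variables()])     # includes 'w/Momentum:0'
    full_saver.save(sess, "/tmp/model")  # checkpoints the momentum state too
```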

If you look at how we save in the master code, the saver is built from a list of parameters that currently does not include the optimizer state; see the sketch below for one way to extend it.
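One possible fix, sketched against the plain TF 1.x optimizer API rather than tfutils itself (`opt` and the saver construction here are assumptions for illustration): augment whatever var_list is passed to the saver with the optimizer's slot variables, which hold the momentum accumulators:

```python
# Sketch: extend an existing parameter list with the optimizer's
# slot variables so the momentum state is checkpointed as well.
# `opt` is assumed to be the tf.train.MomentumOptimizer in use.
params = tf.trainable_variables()
slot_vars = [opt.get_slot(v, name)
             for name in opt.get_slot_names()  # e.g. ['momentum']
             for v in params]
slot_vars = [s for s in slot_vars if s is not None]

saver = tf.train.Saver(var_list=params + slot_vars)
```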

Useful link: https://github.com/tensorflow/tensorflow/issues/5595