neuroailab / tfutils

Utilities for working with tensorflow
MIT License

When restoring a model, logs variables in checkpoint not in graph #124

Closed jvrsgsty closed 5 years ago

jvrsgsty commented 5 years ago

When restoring models, I've found it useful to see the full picture of which variables were restored and which were not. We have two overlapping sets of variables: the variables in the checkpoint and the variables in the graph.

We currently log:

I propose also logging the variables that are in the checkpoint but not in the graph. This would help verify that the relevant checkpoint variables were restored in cases where the current graph is only a subset of the checkpoint being loaded.
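As a rough sketch of the proposed logging, the three sets of interest fall out of plain set operations on variable names. The name lists below are hypothetical; in practice they would come from reading the checkpoint and from the current graph's variables.

```python
# Illustrative sketch (not the tfutils implementation): partition variable
# names into the three sets discussed in this issue.

def partition_variables(ckpt_vars, graph_vars):
    """Split variable names into restored / unrestored / checkpoint-only sets."""
    ckpt, graph = set(ckpt_vars), set(graph_vars)
    return {
        # in both the checkpoint and the graph: these get restored
        "restored": sorted(ckpt & graph),
        # in the graph but not the checkpoint: currently logged as unrestored
        "unrestored": sorted(graph - ckpt),
        # in the checkpoint but not the graph: the proposed addition
        "checkpoint_only": sorted(ckpt - graph),
    }

if __name__ == "__main__":
    sets = partition_variables(
        ckpt_vars=["conv1/kernel", "conv1/bias", "fc/kernel"],
        graph_vars=["conv1/kernel", "conv1/bias", "new_head/kernel"],
    )
    print(sets["checkpoint_only"])  # prints ['fc/kernel']
```

With the hypothetical names above, `fc/kernel` exists in the checkpoint but has no counterpart in the graph, so it would be surfaced by the new log line.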

Potential improvements:

anayebi commented 5 years ago

@jvrsgsty have you tested that this correctly logs the variables that are 1) in the graph but not in the checkpoint (unrestored variables), 2) in the checkpoint but not in the graph (your change), and 3) in both (restored variables)? If not, I would pull the current version of master into your branch and test these 3 things.

jvrsgsty commented 5 years ago

@anayebi this worked fine when I tested it, but I can pull master in and test it out again and make sure that is the case :)

anayebi commented 5 years ago

Ok great -- did you run the unit tests? If so, I will pull it in.

jvrsgsty commented 5 years ago

Let me finish testing this with master + running the tests and I'll report back.

anayebi commented 5 years ago

Unit tests pass, and I get the correct outputs when I restore models with these changes. Optimizer variables are also filtered out for clarity. Pulling it into master now.