neuroailab / tfutils

Utilities for working with tensorflow
MIT License

Partial loading / training / backprop / learning rates #32

Closed yamins81 closed 7 years ago

yamins81 commented 7 years ago

It's often convenient to:

1. load some of the variables of a trained model into a new model that also contains additional, untrained variables, and then either keep the old variables fixed or train them at a new (generally lower) learning rate (e.g. finetuning, though more sophisticated schemes are possible);
2. more generally, use different learning rates on different parts of the model;
3. train some portion of a model without back-propagating its errors to the upstream components.

APIs and code for all of these "partial" behaviors should be added.
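As a point of reference, all three behaviors can already be expressed with stock TensorFlow primitives; a minimal sketch follows, written against the TF2 eager API for brevity (tfutils itself targets graph-mode TF, and all variable and checkpoint names here are illustrative, not part of any tfutils API):

```python
import tensorflow as tf

# (1) Partial loading: a tf.train.Checkpoint restores exactly the objects
# it is given, so variables left out of it keep their fresh initialization.
old_w = tf.Variable(tf.ones([2, 2]), name="pretrained_w")   # "trained" part
new_w = tf.Variable(tf.zeros([2, 2]), name="new_w")         # untrained part
path = tf.train.Checkpoint(pretrained_w=old_w).write("/tmp/partial_ckpt")
old_w.assign(tf.fill([2, 2], 5.0))                          # perturb...
tf.train.Checkpoint(pretrained_w=old_w).restore(path)       # ...then restore

# (2) Different learning rates per variable group: one optimizer per group,
# each applied only to its own (gradient, variable) pairs.
slow_opt = tf.keras.optimizers.SGD(learning_rate=1e-4)  # finetuned variables
fast_opt = tf.keras.optimizers.SGD(learning_rate=1e-2)  # new variables

# (3) Blocking upstream backprop: tf.stop_gradient cuts the gradient path
# at a chosen tensor, so upstream variables receive no gradient.
x = tf.constant([[1.0, 2.0]])
with tf.GradientTape() as tape:
    upstream = tf.matmul(x, old_w)
    downstream = tf.matmul(tf.stop_gradient(upstream), new_w)
    loss = tf.reduce_sum(downstream)
g_old, g_new = tape.gradient(loss, [old_w, new_w])
# g_old is None: nothing back-propped into the upstream variable.
fast_opt.apply_gradients([(g_new, new_w)])
```

Keeping a variable fully fixed is the degenerate case of (2): simply never pass it to any optimizer. A tfutils API would presumably wrap these pieces behind config options (which variables to restore, which optimizer/learning rate each group gets, where to cut gradients).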