Anmol6 / DNGO-BO

Bayesian optimization with DNGO (Deep Networks for Global Optimization)

Some optimizers need re-initialization of variables #11

Closed Anmol6 closed 7 years ago

Anmol6 commented 7 years ago

TensorFlow optimizers (excluding plain gradient descent) create additional variables when added to the graph. These variables must be initialized before running the optimizer, which is undesirable if we are already using learned parameters: re-running a global initializer would overwrite them.

To overcome this, the following trick can be used:

import tensorflow as tf

# Store all variables before creating the optimizer
vars_before = set(tf.all_variables())

# Create the optimizer; this adds its own slot variables to the graph
opt = tf.train.AdamOptimizer()
opt_op = opt.minimize(...)
vars_after = set(tf.all_variables())

# Initialize only the new variables created by the optimizer.
# Note: tf.initialize_all_variables() takes no arguments; to target a
# specific subset, use tf.initialize_variables with an explicit list.
sess.run(tf.initialize_variables(list(vars_after - vars_before)))
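The core of the trick is just a set difference over the graph's variable collection. A minimal pure-Python sketch of the same pattern, with plain strings standing in for TensorFlow variables (the `adam_m`/`adam_v` names here are illustrative, not TensorFlow's actual slot names):

```python
# Stand-in for the graph's variable collection; in TensorFlow this is
# what tf.all_variables() returns.
graph_vars = []

def make_var(name):
    graph_vars.append(name)
    return name

# Model parameters, assumed already trained or restored.
make_var("W")
make_var("b")

# Snapshot the variable set before building the optimizer.
vars_before = set(graph_vars)

# Building Adam adds slot variables (moment accumulators) to the graph.
for v in ("W", "b"):
    make_var(v + "/adam_m")
    make_var(v + "/adam_v")

vars_after = set(graph_vars)

# Only the optimizer's new variables are selected; W and b are untouched.
new_vars = vars_after - vars_before
print(sorted(new_vars))
# → ['W/adam_m', 'W/adam_v', 'b/adam_m', 'b/adam_v']
```

Because the learned parameters are outside the difference, initializing `new_vars` cannot clobber them.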