There seem to be two ways to get all parameters (as they are called in TL) / variables (as they are called in TF): one, provided by TL, is `all_params`; the other, provided by TF (and wrapped by TL), is `get_variables_with_name()`.
I wonder what the difference between them is. I did some trials and found they are not the same: `all_params` is always larger because it contains the `moving_mean` and `moving_variance` of many layers. But sometimes it can also leave out some important variables, like `beta` and `gamma` in some batch normalization layers, though not all.
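For reference, this is roughly how I compared the two lists. The tiny network below is just a toy stand-in (one batch-normalized dense layer under a made-up scope name `mlp`), not my real model:

```python
import tensorflow as tf
import tensorlayer as tl

x = tf.placeholder(tf.float32, [None, 784])
with tf.variable_scope('mlp'):
    net = tl.layers.InputLayer(x, name='input')
    net = tl.layers.DenseLayer(net, n_units=256, name='dense1')
    net = tl.layers.BatchNormLayer(net, is_train=True, name='bn1')

# What TL itself tracks on the network object:
names_tl = {p.name for p in net.all_params}
# What the TF-backed helper returns for the same scope (trainable only):
vars_tf = tl.layers.get_variables_with_name('mlp', train_only=True, printable=False)
names_tf = {v.name for v in vars_tf}

# In my trials the first set prints moving_mean / moving_variance.
print('only in all_params:', sorted(names_tl - names_tf))
print('only in get_variables_with_name:', sorted(names_tf - names_tl))
```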
I want to know which one I should use in the following situations, and why (a minimal sketch of both situations follows the two questions):
1. When I'm training a GAN, where I need to train the generator and discriminator separately, which should I use as the `var_list` in `tf.train.Optimizer.minimize()`?
2. When I want to save a trained model to disk for loading later, which should I use as the `params` in `tl.files.save_npz()`?
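To make the two situations concrete, here is a minimal sketch of what I have in mind, with toy one-layer stand-ins for the real generator and discriminator (the scope names `generator` and `discriminator`, the losses, the learning rate, and the file name `g.npz` are placeholders I made up, not from any repo):

```python
import tensorflow as tf
import tensorlayer as tl

z = tf.placeholder(tf.float32, [None, 100])

# Toy stand-ins for the real networks, just to make the scoping concrete.
with tf.variable_scope('generator'):
    net_g = tl.layers.InputLayer(z, name='g_in')
    net_g = tl.layers.DenseLayer(net_g, n_units=784, act=tf.nn.tanh, name='g_dense')

with tf.variable_scope('discriminator'):
    net_d = tl.layers.InputLayer(net_g.outputs, name='d_in')
    net_d = tl.layers.DenseLayer(net_d, n_units=1, name='d_dense')

d_logits = net_d.outputs
g_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
    logits=d_logits, labels=tf.ones_like(d_logits)))
d_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
    logits=d_logits, labels=tf.zeros_like(d_logits)))

# Situation 1: pick each sub-network's trainable variables by name scope,
# so each optimizer only updates its own half of the model.
g_vars = tl.layers.get_variables_with_name('generator', train_only=True, printable=False)
d_vars = tl.layers.get_variables_with_name('discriminator', train_only=True, printable=False)
g_train_op = tf.train.AdamOptimizer(2e-4, beta1=0.5).minimize(g_loss, var_list=g_vars)
d_train_op = tf.train.AdamOptimizer(2e-4, beta1=0.5).minimize(d_loss, var_list=d_vars)

# Situation 2: save the generator's parameters for a later restore.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    tl.files.save_npz(net_g.all_params, name='g.npz', sess=sess)
```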
I know there is a DCGAN repo written in TL, and `get_variables_with_name()` is chosen there, while the doc says to choose `all_params` for `tl.files.save_npz()`. But what if I made a different choice in each case? These two situations don't differ much in my opinion, so why are the answers not the same?
Thank you.