The help message of `train_image_classifier.py`, as printed e.g. by `python train_image_classifier.py --help`, can be improved so that the following descriptions of optimizer-related flags become more accurate and easier to understand:
| Flag name | Current description | Dependency | New description candidate |
|---|---|---|---|
| `momentum` | "The momentum for the MomentumOptimizer and RMSPropOptimizer." | Has no effect when using RMSPropOptimizer. | "The momentum for MomentumOptimizer." |
| `rmsprop_momentum` | "Momentum." | Has effect only when using RMSPropOptimizer. | "The momentum for RMSPropOptimizer." |
| `opt_epsilon` | "Epsilon term for the optimizer." | Has effect only when using Adadelta, Adam, or RMSProp. | "Epsilon term for AdadeltaOptimizer, AdamOptimizer, and RMSPropOptimizer." |
| `adam_beta1` | "The exponential decay rate for the 1st moment estimates." | Has effect only when using AdamOptimizer. | "The exponential decay rate for the 1st moment estimates. Relevant only for AdamOptimizer." |
| `adam_beta2` | "The exponential decay rate for the 2nd moment estimates." | Has effect only when using AdamOptimizer. | "The exponential decay rate for the 2nd moment estimates. Relevant only for AdamOptimizer." |
| `ftrl_learning_rate_power` | "The learning rate power." | Has effect only when using FtrlOptimizer. | "The FTRL learning rate power." |
| `learning_rate_decay_factor` | "Learning rate decay factor." | Has effect only when using exponential decay. | "Learning rate decay factor used by the exponential learning rate decay." |
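The dependency column can be made concrete with a small sketch. This is not the script's actual code: the flag names are the real ones, but the mapping below is a hypothetical simplification of the script's optimizer-configuration logic (restricted to the flags discussed here), and the decay helper assumes `tf.train.exponential_decay` semantics with `staircase=True`.

```python
# Hypothetical sketch: which of the flags discussed above each
# --optimizer value actually reads. A simplified stand-in for the
# script's optimizer-configuration logic, not a copy of it.
FLAG_DEPENDENCIES = {
    "momentum": {"momentum"},
    "rmsprop": {"rmsprop_momentum", "opt_epsilon"},
    "adam": {"adam_beta1", "adam_beta2", "opt_epsilon"},
    "adadelta": {"opt_epsilon"},
    "ftrl": {"ftrl_learning_rate_power"},
    "sgd": set(),
}


def flags_read_by(optimizer_name):
    """Return the tuning flags (from the table) the chosen optimizer reads."""
    return FLAG_DEPENDENCIES[optimizer_name]


def exponentially_decayed_lr(initial_lr, decay_factor, global_step,
                             decay_steps, staircase=True):
    """Mirror tf.train.exponential_decay:
    lr = initial_lr * decay_factor ** (global_step / decay_steps),
    with the exponent truncated to an integer when staircase=True."""
    exponent = global_step / decay_steps
    if staircase:
        exponent = global_step // decay_steps
    return initial_lr * decay_factor ** exponent


# --momentum is ignored when --optimizer=rmsprop; RMSProp reads
# --rmsprop_momentum instead, which is what the reworded help texts state.
assert "momentum" not in flags_read_by("rmsprop")
assert "rmsprop_momentum" in flags_read_by("rmsprop")

# --learning_rate_decay_factor only matters for exponential decay, e.g.:
print(exponentially_decayed_lr(0.1, 0.94, global_step=4000,
                               decay_steps=2000))  # 0.1 * 0.94 ** 2
```

The asserts encode exactly the corrections proposed in the table: the current help text for `momentum` suggests it affects RMSProp, while only `rmsprop_momentum` does.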
Prerequisites

**1. The entire URL of the documentation with the issue**

https://github.com/tensorflow/models/blob/master/research/slim/train_image_classifier.py

**2. Describe the issue**
The help message of `train_image_classifier.py`, as printed e.g. by `python train_image_classifier.py --help`, can be improved so that the descriptions of the following optimizer-related flags become more accurate and easier to understand:

- `momentum`
- `rmsprop_momentum`
- `opt_epsilon`
- `adam_beta1`
- `adam_beta2`
- `ftrl_learning_rate_power`
- `learning_rate_decay_factor`