google-research / mixmatch

Apache License 2.0
1.13k stars 163 forks

Problem defining the train_op in mixmatch.py at line 94 #22

Closed zhangjinyangnwpu closed 4 years ago

zhangjinyangnwpu commented 4 years ago
train_op = tf.train.AdamOptimizer(lr).minimize(
    loss_xe + w_match * loss_l2u, colocate_gradients_with_ops=True)
with tf.control_dependencies([train_op]):
    train_op = tf.group(*post_ops)

When the train op is defined like this, won't the optimizer step be ignored, so that only the post_ops run during training? How does this work?

david-berthelot commented 4 years ago

The tf.control_dependencies context ensures that the AdamOptimizer step runs before the operations contained in post_ops. Reassigning train_op to tf.group(*post_ops) does not discard the optimizer: every op created inside the context carries a control dependency on the original train_op, so running the grouped op forces the optimizer step to execute first.

https://www.tensorflow.org/versions/r1.14/api_docs/python/tf/control_dependencies