YingZhangDUT / Deep-Mutual-Learning

TensorFlow Implementation of Deep Mutual Learning
MIT License

1. You may build two nets: one uses the official model name, like "MobileNetV1", as its variable scope, and the other uses "net1" as its variable scope. #8

Open sankin1770 opened 6 years ago

sankin1770 commented 6 years ago
  2. Restore the pretrained checkpoint to "MobileNetV1".
  3. Assign all the variables under "MobileNetV1" to "net1".

Originally posted by @YingZhangDUT in https://github.com/YingZhangDUT/Deep-Mutual-Learning/issues/4#issuecomment-388688149

sankin1770 commented 6 years ago

May I ask how to do the third step? I am looking forward to your reply.

YingZhangDUT commented 6 years ago

You can add the assign operation after init like this:

```python
sess.run(init)

init_update_op = [tf.assign(b, a)
                  for a, b in zip(net_var_list["{0}".format(0)],
                                  net_var_list["{0}".format(1)])]
sess.run(init_update_op)

tf.train.start_queue_runners(sess=sess)
```

Here the name_scope of net_var_list["{0}".format(0)] should be 'MobilenetV1'.
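The snippet above can be made self-contained with toy one-variable stand-ins for the two networks (a minimal sketch using the TF 1.x API via `tf.compat.v1`; the scope names follow the thread, and the real nets would of course have many variables each):

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style API, also available under TF 2.x
tf.disable_eager_execution()
tf.reset_default_graph()

# Toy stand-ins for the two networks: one variable per scope.
with tf.variable_scope('MobilenetV1'):
    w_pre = tf.get_variable('w', initializer=[1.0, 2.0])
with tf.variable_scope('net1'):
    w_new = tf.get_variable('w', initializer=[0.0, 0.0])

# net_var_list["0"] holds the pretrained net's variables,
# net_var_list["1"] the second net's, matched up by creation order.
net_var_list = {
    "0": tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope='MobilenetV1'),
    "1": tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope='net1'),
}

init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    # Copy each 'MobilenetV1' variable into the matching 'net1' variable.
    init_update_op = [tf.assign(b, a)
                      for a, b in zip(net_var_list["0"], net_var_list["1"])]
    sess.run(init_update_op)
    copied = sess.run(w_new)  # now equals the 'MobilenetV1' weights
```

Note that `zip` pairs the variables positionally, so this only works when both scopes build the variables in the same order, which holds when both nets are built from the same model function.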

sankin1770 commented 5 years ago

Thank you for your reply. I'm a novice at TensorFlow. I use `saver = tf.train.Saver()` and `saver.restore(sess, "resnetv1_50.ckpt")` to restore the parameters of the model pretrained on ImageNet, but it doesn't seem to work. Please tell me what variables I should fill in in the brackets of `tf.train.Saver(*)`.

YingZhangDUT commented 5 years ago

To load pretrained models, you can learn from "Fine-tuning a model from an existing checkpoint" on https://github.com/tensorflow/models/tree/master/research/slim, or check https://github.com/YingZhangDUT/Cross-Modal-Projection-Learning, where I load a pretrained ResNet-152.
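The point of "what goes in the brackets" is that `tf.train.Saver(var_list)` only reads and writes the variables you pass it. A minimal sketch (the checkpoint here is a temporary stand-in created by the same saver; in practice you would point `restore` at the downloaded file, e.g. the ResNet checkpoint mentioned above):

```python
import os
import tempfile
import tensorflow.compat.v1 as tf  # TF 1.x-style API, also available under TF 2.x
tf.disable_eager_execution()
tf.reset_default_graph()

# Toy variable standing in for the pretrained backbone.
with tf.variable_scope('MobilenetV1'):
    w = tf.get_variable('w', initializer=[1.0, 2.0])

# The list passed to tf.train.Saver(...) is "what goes in the brackets":
# this saver only touches the 'MobilenetV1' variables.
variables_to_restore = tf.get_collection(
    tf.GraphKeys.GLOBAL_VARIABLES, scope='MobilenetV1')
restorer = tf.train.Saver(variables_to_restore)

ckpt = os.path.join(tempfile.mkdtemp(), 'model.ckpt')
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    restorer.save(sess, ckpt)     # stand-in for the downloaded checkpoint
    restorer.restore(sess, ckpt)  # in practice: restorer.restore(sess, <pretrained ckpt path>)
    restored = sess.run(w)
```

Passing `tf.train.Saver()` with no argument saves/restores all global variables, which fails when the checkpoint does not contain every variable in your graph (e.g. the second net's copies), hence the explicit list.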

YingZhangDUT commented 5 years ago

You can set up two Savers. The main one, `saver = tf.train.Saver(tf.global_variables())`, is used to save the parameters of both nets. When restoring, create another one, `restorer = tf.train.Saver(variables_to_restore)`, to restore the pretrained parameters, where `variables_to_restore` is the list of all MobileNetV1 variables.
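The two-saver setup can be sketched as follows (toy one-variable scopes again; a minimal sketch, not the repository's actual training script):

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style API, also available under TF 2.x
tf.disable_eager_execution()
tf.reset_default_graph()

# Toy one-variable stand-ins for the two networks.
with tf.variable_scope('MobilenetV1'):
    tf.get_variable('w', shape=[2])
with tf.variable_scope('net1'):
    tf.get_variable('w', shape=[2])

# Saver 1: checkpoints ALL variables (both nets) during training.
saver = tf.train.Saver(tf.global_variables())

# Saver 2: restores only the pretrained MobileNetV1 parameters.
variables_to_restore = tf.get_collection(
    tf.GraphKeys.GLOBAL_VARIABLES, scope='MobilenetV1')
restorer = tf.train.Saver(variables_to_restore)
```

During training you call `restorer.restore(sess, pretrained_ckpt)` once at startup and `saver.save(sess, train_ckpt)` periodically, so the pretrained weights seed one net while checkpoints capture both.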