XiaoqingNLP closed this issue 6 years ago
I think this problem is caused by how the checkpoint file records model paths. When I use the default path to save the model, the checkpoint stores a relative path like this:

```
all_model_checkpoint_paths: "model.ckpt-223793"
```

But when I set `args.output` as a parameter, it stores an absolute path:

```
all_model_checkpoint_paths: "/data/xqzhou/new_thumt_Dir/model/transformer-baseline_thumt_share_emb_softmax/model.ckpt-242000"
```
hooks.py, line 282:

```python
if added is not None:
    old_path = os.path.join(self._base_dir, added)
    new_path = os.path.join(self._save_path, added)
    old_files = tf.gfile.Glob(old_path + "*")
    tf.logging.info("Copying %s to %s" % (old_path, new_path))
```
The following is a minimal reproduction of the problem, which may help you confirm it:
```python
import os

added = '/data/xqzhou/new_thumt_Dir/model/transformer-baseline-1gpu/model.ckpt-2002'
old_path = os.path.join('/data/xqzhou/new_thumt_Dir/model/transformer-baseline-1gpu/model.ckpt-2002', added)
new_path = os.path.join('/data/xqzhou/new_thumt_Dir/model/transformer-baseline-1gpu/eval', added)
print("old_path:%s new_path:%s" % (old_path, new_path))
```
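The root cause is a documented behavior of `os.path.join`: when a later component is an absolute path, all earlier components are discarded, so joining a base directory with an absolute checkpoint path silently ignores the base directory. A minimal sketch with illustrative paths (not the actual directories above):

```python
import os

# If the second argument is absolute, os.path.join discards the first,
# so the "joined" path is just the absolute checkpoint path itself.
abs_ckpt = "/model/dir/model.ckpt-2002"
print(os.path.join("/eval/dir", abs_ckpt))   # -> /model/dir/model.ckpt-2002

# A relative checkpoint name joins as intended under the base directory.
rel_ckpt = "model.ckpt-2002"
print(os.path.join("/eval/dir", rel_ckpt))   # -> /eval/dir/model.ckpt-2002
```

This is why the hook works with the default (relative) checkpoint records but breaks when `args.output` causes absolute paths to be written.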
If my diagnosis is correct, the following change fixes the problem and may be useful for other users:
hooks.py, line 282:

```python
if added is not None:
    old_path = os.path.join(self._base_dir, os.path.basename(added))
    new_path = os.path.join(self._save_path, os.path.basename(added))
    old_files = tf.gfile.Glob(old_path + "*")
    tf.logging.info("Copying %s to %s" % (old_path, new_path))
```
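The fix works because `os.path.basename` reduces the checkpoint record to its filename before joining, so the result is the same whether the checkpoint file stores a relative or an absolute path. A quick check with illustrative paths (`base_dir` here stands in for `self._base_dir`):

```python
import os

base_dir = "/eval/dir"  # stand-in for self._base_dir
for added in ("model.ckpt-242000",
              "/model/dir/model.ckpt-242000"):
    # basename strips any directory prefix, making the join robust
    path = os.path.join(base_dir, os.path.basename(added))
    print(path)  # -> /eval/dir/model.ckpt-242000 in both cases
```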