Open yorkchu1995 opened 5 years ago
If you want to map variable A to B, the names (including scope) of A and B should have the same scope level; e.g., if the scope of A is xx/xx/xxx/name_a, then B should be yy/yy/yyy/name_b.
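To make the rule concrete, here is a minimal sketch (plain Python; `scope_depth` is a hypothetical helper, not part of BERT's code) that checks whether two variable names sit at the same scope level:

```python
def scope_depth(var_name):
    """Number of '/'-separated components in a TF-style variable name.
    'xx/xx/xxx/name_a' has depth 4: three enclosing scopes plus the leaf name."""
    return len(var_name.split("/"))

# A can map to B only if both names have the same scope depth:
a = "xx/xx/xxx/name_a"
b = "yy/yy/yyy/name_b"
print(scope_depth(a) == scope_depth(b))  # True: both have depth 4
```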
But mine does have the same scope level; I don't know why it still fails.
Hi, I got the same problem. Did you solve it?
@xyfZzz Can you test whether the workaround below helps?
Workaround: in modeling.py/get_assignment_map_from_checkpoint:
- assignment_map[name] = name
+ assignment_map[name] = name_to_variable[name]
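For context, here is a simplified, self-contained sketch of what the patched mapping does (`FakeVariable` and `get_assignment_map` are illustrative stand-ins, not the real BERT modeling.py code): it strips the ":0" suffix TensorFlow appends to variable names and maps each checkpoint name to the variable object itself rather than to its name string.

```python
import collections
import re

class FakeVariable:
    """Minimal stand-in for tf.Variable, for illustration only."""
    def __init__(self, name):
        self.name = name  # TF variable names carry a ":0"-style suffix

def get_assignment_map(ckpt_var_names, graph_vars):
    # Map bare names (":0" stripped) to the variable objects in the graph.
    name_to_variable = collections.OrderedDict()
    for var in graph_vars:
        m = re.match(r"^(.*):\d+$", var.name)
        name_to_variable[m.group(1) if m else var.name] = var

    assignment_map = collections.OrderedDict()
    for name in ckpt_var_names:
        if name in name_to_variable:
            # The patched line: map to the variable object, not the name
            # string, so init_from_checkpoint does not re-resolve the name
            # by scope and trip over mismatched scope prefixes.
            assignment_map[name] = name_to_variable[name]
    return assignment_map
```

With a graph variable named "bert/embeddings/word_embeddings:0" and the same name (without ":0") in the checkpoint, the map's value is the variable itself instead of a string.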
I had the same problem, and in my case this workaround worked.
I had the same problem too, and this workaround worked.
I am facing the same problem. Inside with tf.variable_scope("loss"): I put the code in an if/else: one branch builds a dense layer and the other a highway layer, but I am also receiving the same error:

  File "C:\Users\t-mdrahm\code\bert-dlis\model\model.py", line 267, in build_tf_graph
    self.num_input_bow_features_map)
  File "C:\Users\t-mdrahm\code\bert-dlis\model\model.py", line 105, in init
    self.load_checkpoint(init_checkpoint, self.known_task_names, within_scope=self.scope)
  File "C:\Users\t-mdrahm\code\bert-dlis\model\model.py", line 125, in load_checkpoint
    tf.train.init_from_checkpoint(init_checkpoint, assignment_map)
  File "C:\Users\t-mdrahm\AppData\Local\Programs\Python\Python37\lib\site-packages\tensorflow\python\training\checkpoint_utils.py", line 190, in init_from_checkpoint
    _init_from_checkpoint, args=(ckpt_dir_or_file, assignment_map))
  File "C:\Users\t-mdrahm\AppData\Local\Programs\Python\Python37\lib\site-packages\tensorflow\python\distribute\distribute_lib.py", line 1516, in merge_call
    return self._merge_call(merge_fn, args, kwargs)
  File "C:\Users\t-mdrahm\AppData\Local\Programs\Python\Python37\lib\site-packages\tensorflow\python\distribute\distribute_lib.py", line 1524, in _merge_call
    return merge_fn(self._distribution_strategy, args, kwargs)
  File "C:\Users\t-mdrahm\AppData\Local\Programs\Python\Python37\lib\site-packages\tensorflow\python\training\checkpoint_utils.py", line 246, in _init_from_checkpoint
    scopes, tensor_name_in_ckpt))
ValueError: Assignment map with scope only name 61aef709/Passage/loss/highwayLayer/denseUnit0/denseUnit0Bias should map to scope only Passage/loss/highwayLayer/denseUnit0/denseUnit0Bias/Variable. Should be 'scope/': 'other_scope/'.
The workaround in modeling.py/get_assignment_map_from_checkpoint IS NOT WORKING for me. Please help.
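For what it's worth, the last line of that ValueError points at the other form init_from_checkpoint accepts: a whole-scope mapping like {'old_scope/': 'new_scope/'}. A minimal sketch of how such a prefix mapping renames a checkpoint variable name (plain Python, with names taken from the traceback for illustration; `apply_scope_map` is a hypothetical helper, not TF's implementation):

```python
def apply_scope_map(ckpt_name, scope_map):
    """Rewrite a checkpoint variable name by replacing its scope prefix.
    Entries in scope_map are expected to end with '/'."""
    for old_prefix, new_prefix in scope_map.items():
        if ckpt_name.startswith(old_prefix):
            return new_prefix + ckpt_name[len(old_prefix):]
    return ckpt_name  # no prefix matched: name is left unchanged

print(apply_scope_map(
    "61aef709/Passage/loss/highwayLayer/denseUnit0/denseUnit0Bias",
    {"61aef709/Passage/": "Passage/"}))
# -> Passage/loss/highwayLayer/denseUnit0/denseUnit0Bias
```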
We added an attention layer at the end of the model. It is fine during fine-tuning, but when I load the saved model.ckpt again, it always fails as follows: