When I run demo.py, it shows:

Traceback (most recent call last):
  File "demo2.py", line 51, in <module>
    saver.restore(isess, ckpt_filename)
Caused by op 'save/RestoreV2_33', defined at:
  File "demo2.py", line 50, in <module>
    saver = tf.train.Saver()
DataLossError (see above for traceback): Checksum does not match: stored 2973489966 vs. calculated on the restored bytes 4004437628
	 [[Node: save/RestoreV2_33 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_save/Const_0_0, save/RestoreV2_33/tensor_names, save/RestoreV2_33/shape_and_slices)]]
	 [[Node: save/RestoreV2_76/_163 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device_incarnation=1, tensor_name="edge_884_save/RestoreV2_76", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]
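For context, this DataLossError means the bytes read back from the checkpoint file no longer match the CRC that was stored when the file was written, which usually indicates the .ckpt data file is truncated or corrupted (for example, from an interrupted download or copy). A minimal sketch of that failure mode, using Python's standard `zlib.crc32` instead of the CRC32C that TensorFlow checkpoints actually use (the byte strings here are made up for illustration):

```python
import zlib

# Bytes as they were when the checkpoint was written.
original = b"checkpoint tensor bytes"
stored_crc = zlib.crc32(original)  # checksum recorded at save time

# Simulate corruption: flip one byte, as a bad disk or partial copy would.
corrupted = bytearray(original)
corrupted[0] ^= 0xFF
restored_crc = zlib.crc32(bytes(corrupted))

# On restore, the recomputed checksum no longer matches the stored one.
if stored_crc != restored_crc:
    print(f"Checksum does not match: stored {stored_crc} "
          f"vs. calculated on the restored bytes {restored_crc}")
```

Because the mismatch is detected against data already on disk, the usual remedy is to re-download or regenerate the checkpoint files rather than change the restore code.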