NicoRahm / CGvsPhoto

Computer Graphics vs Real Photographic Images : A Deep-learning approach
MIT License

How to run a pretrained model? #7

Closed ZhangYuef closed 1 year ago

ZhangYuef commented 4 years ago

Hi, thanks for the sharing!

I have a problem when I try to run the test part of your model using pretrained parameters. I simply run `examples/test_total.py`, inputting the checkpoint name as `32_32_64_remove_context_5x5_color_3chan_8500.ckpt.data-00000-of-00001`.

I then get the following error:

tensorflow.python.framework.errors_impl.DataLossError: Unable to open table file
/CGvsPhoto/weights/32_32_64_remove_context_5x5_color_3chan_8500.ckpt.data-00000-of-00001:
Data loss: not an sstable (bad magic number): perhaps your file is in a different
file format and you need to use a different restore operator?

Maybe I'm passing something incorrectly. What causes this, and how can I solve it?
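For what it's worth, a TensorFlow V2-format checkpoint is stored as several files (`.data-XXXXX-of-XXXXX` shards plus an `.index` file), and the restore operation expects the common prefix ending in `.ckpt`, not the name of a shard file. A small helper to derive that prefix (a hedged sketch; `checkpoint_prefix` is not part of the repository):

```python
import re

def checkpoint_prefix(filename):
    """Strip TF V2 checkpoint shard/index suffixes to get the restore prefix."""
    # Remove a trailing ".data-XXXXX-of-XXXXX" or ".index" suffix, if present.
    return re.sub(r'\.(data-\d{5}-of-\d{5}|index)$', '', filename)

print(checkpoint_prefix(
    "32_32_64_remove_context_5x5_color_3chan_8500.ckpt.data-00000-of-00001"))
# -> 32_32_64_remove_context_5x5_color_3chan_8500.ckpt
```

Passing the `.data-00000-of-00001` file directly is what produces the "not an sstable (bad magic number)" error above.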

simmimourya commented 4 years ago

I am facing a similar issue, although I am passing only the checkpoint name up to `.ckpt` (`run10.ckpt`) as the argument. I'm working with TensorFlow 2.3.0.

It throws the following error:

Name of the file to restore (Directory : ../../trained_weights/) : run10.ckpt
2020-08-04 05:45:31.392843: W tensorflow/core/framework/op_kernel.cc:1767] OP_REQUIRES failed at save_restore_v2_ops.cc:184 : Not found: Key Conv1/Bias/Variable/Adam not found in checkpoint
Traceback (most recent call last):
  File "/home/simmi/env2/torch37/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1365, in _do_call
    return fn(*args)
  File "/home/simmi/env2/torch37/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1350, in _run_fn
    target_list, run_metadata)
  File "/home/simmi/env2/torch37/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1443, in _call_tf_sessionrun
    run_metadata)
tensorflow.python.framework.errors_impl.NotFoundError: Key Conv1/Bias/Variable/Adam not found in checkpoint
  [[{{node save/RestoreV2}}]]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/simmi/env2/torch37/lib/python3.7/site-packages/tensorflow/python/training/saver.py", line 1299, in restore
    {self.saver_def.filename_tensor_name: save_path})
  File "/home/simmi/env2/torch37/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 958, in run
    run_metadata_ptr)
  File "/home/simmi/env2/torch37/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1181, in _run
    feed_dict_tensor, options, run_metadata)
  File "/home/simmi/env2/torch37/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1359, in _do_run
    run_metadata)
  File "/home/simmi/env2/torch37/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1384, in _do_call
    raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.NotFoundError: Key Conv1/Bias/Variable/Adam not found in checkpoint
  [[node save/RestoreV2 (defined at /home/simmi/env2/torch37/lib/python3.7/site-packages/CGvsPhoto/model.py:1044) ]]

I'm not sure what's going wrong here. How should I fix it? Thank you!

simmimourya commented 4 years ago

Hi, I was able to solve the issue! The error was caused by a minor inconsistency between the model arguments used for training and testing.

In `test_total.py`, `remove_context` is set to `False`:

clf = Model(database_path, image_size, config = 'Server', filters = [32, 32, 64],
            batch_size = 50, feature_extractor = 'Stats', remove_context = False,
            remove_filter_size = 5, only_green = False)

During training (`test_pipeline.py`), it is set to `True`:

clf = Model(database_path, image_size, config = 'Personal', filters = [32,32,64],
            batch_size = 50, feature_extractor = 'Stats', remove_context = True, 
            remove_filter_size = 5, only_green = False)

Please make sure they are consistent!
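One way to keep the two scripts from drifting apart is to define the constructor keyword arguments once and reuse them in both places. This is only a sketch (the `MODEL_KWARGS` name and the commented `Model(...)` call are illustrative, based on the snippets above, not part of the repository):

```python
# Shared hyperparameters for both training and testing scripts, so that
# remove_context (and the other architecture flags) cannot diverge between them.
MODEL_KWARGS = dict(
    filters=[32, 32, 64],
    batch_size=50,
    feature_extractor='Stats',
    remove_context=True,     # must match the value used at training time
    remove_filter_size=5,
    only_green=False,
)

# Then, in both test_pipeline.py and test_total.py:
# clf = Model(database_path, image_size, config='Server', **MODEL_KWARGS)
```

A checkpoint only stores the variables of the graph it was trained with, so any flag that changes the architecture (like `remove_context`) must be identical at restore time; otherwise TensorFlow looks for keys that don't exist in the checkpoint.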

Also, during testing, when the script prompts for the path to the pretrained weights, you can enter just the name up to and including the `.ckpt` extension, for example: `my_first_run.ckpt`

A big thank you to all the authors for their incredible work! :D