thibo73800 / capsnet-traffic-sign-classifier

A TensorFlow implementation of CapsNet (Capsules Net) applied to the German traffic sign dataset
Apache License 2.0

ValueError: At least two variables have the same name: conv2d_1/kernel/Adam #5

Closed. AloshkaD closed this issue 6 years ago.

AloshkaD commented 6 years ago

When I test with a trained model I get the error below. My TensorFlow version is 1.2.1.
The error changes when I rerun the kernel: sometimes it becomes `ValueError: At least two variables have the same name: weight` and other times `ValueError: At least two variables have the same name: fully_connected/biases`.

```
ModelBase::Loading ckpt ...
INFO:tensorflow:Restoring parameters from outputs/checkpoints/c1nf_16_c2d_0.7_lr_0.0001_c1s_5_rs_1_c1vl_16_c1n_256_c2s_6_c2n_64_c2vl_32_c1s_9--TrafficSign--1513264580.046234
Restoring parameters from outputs/checkpoints/c1nf_16_c2d_0.7_lr_0.0001_c1s_5_rs_1_c1vl_16_c1n_256_c2s_6_c2n_64_c2vl_32_c1s_9--TrafficSign--1513264580.046234
ModelBase::Ckpt ready
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-10-e5003bd16d6a> in <module>()
     89 skpt = "outputs/checkpoints/c1nf_16_c2d_0.7_lr_0.0001_c1s_5_rs_1_c1vl_16_c1n_256_c2s_6_c2n_64_c2vl_32_c1s_9--TrafficSign--1513264580.046234"
     90 testing_file = "dataset/"
---> 91 test(testing_file, skpt)

<ipython-input-10-e5003bd16d6a> in test(dataset, ckpt)
     66     model = ModelTrafficSign("TrafficSign", output_folder=None)
     67     # Load the model
---> 68     model.load(ckpt)
     69 
     70     # Evaluate all the dataset

/home/a/WA/CapsNet/capsnet-traffic-sign-classifier/model_base.py in load(self, ckpt)
    345 
    346         self.model_name = ckpt.split("/")[-1]
--> 347         self.saver = tf.train.Saver()
    348 
    349 

/home/a/anaconda3/envs/keras/lib/python3.5/site-packages/tensorflow/python/training/saver.py in __init__(self, var_list, reshape, sharded, max_to_keep, keep_checkpoint_every_n_hours, name, restore_sequentially, saver_def, builder, defer_build, allow_empty, write_version, pad_step_number, save_relative_paths)
   1137     self._pad_step_number = pad_step_number
   1138     if not defer_build:
-> 1139       self.build()
   1140     if self.saver_def:
   1141       self._check_saver_def()

/home/a/anaconda3/envs/keras/lib/python3.5/site-packages/tensorflow/python/training/saver.py in build(self)
   1168           keep_checkpoint_every_n_hours=self._keep_checkpoint_every_n_hours,
   1169           name=self._name,
-> 1170           restore_sequentially=self._restore_sequentially)
   1171     elif self.saver_def and self._name:
   1172       # Since self._name is used as a name_scope by builder(), we are

/home/a/anaconda3/envs/keras/lib/python3.5/site-packages/tensorflow/python/training/saver.py in build(self, names_to_saveables, reshape, sharded, max_to_keep, keep_checkpoint_every_n_hours, name, restore_sequentially, filename)
    671         unique.
    672     """
--> 673     saveables = self._ValidateAndSliceInputs(names_to_saveables)
    674     if max_to_keep is None:
    675       max_to_keep = 0

/home/a/anaconda3/envs/keras/lib/python3.5/site-packages/tensorflow/python/training/saver.py in _ValidateAndSliceInputs(self, names_to_saveables)
    555     """
    556     if not isinstance(names_to_saveables, dict):
--> 557       names_to_saveables = BaseSaverBuilder.OpListToDict(names_to_saveables)
    558 
    559     saveables = []

/home/a/anaconda3/envs/keras/lib/python3.5/site-packages/tensorflow/python/training/saver.py in OpListToDict(op_list)
    533         if name in names_to_saveables:
    534           raise ValueError("At least two variables have the same name: %s" %
--> 535                            name)
    536         names_to_saveables[name] = var
    537       # pylint: enable=protected-access

ValueError: At least two variables have the same name: conv2d_1/kernel/Adam
```
AloshkaD commented 6 years ago

I fixed it! Apparently variables were leaking between kernel runs. I solved it by making sure only the inference (testing) code runs on a fresh kernel start.
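
For reference, a minimal sketch of the usual workaround in TensorFlow 1.x (an assumption based on the comment above, not code from this repo): clear the default graph before rebuilding the model in the same notebook session, so `tf.train.Saver()` does not pick up stale variables left over from an earlier cell execution. The `ModelTrafficSign` lines are only the calls already shown in the traceback.

```python
# Sketch only: reset the default graph before rebuilding the model so that
# tf.train.Saver() cannot see duplicate variables from a previous run.
# Assumes TensorFlow 1.x (tf.reset_default_graph is a TF 1.x API).
import tensorflow as tf

tf.reset_default_graph()  # drop any variables created by earlier cell executions

# Rebuild and load the model afterwards, e.g. (names taken from the traceback above):
# model = ModelTrafficSign("TrafficSign", output_folder=None)
# model.load(ckpt)
```

Restarting the kernel, as described in the comment above, achieves the same result by starting from an empty process.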