tensorflow / compression

Data compression in TensorFlow
Apache License 2.0
850 stars 248 forks

Could not find variable conv1/gdn_0/reparam_gamma #187

Open zw-xxx opened 9 months ago

zw-xxx commented 9 months ago

Describe the bug
I am using TensorFlow and tensorflow-compression to train a new model in Keras. However, when the code reaches model.fit, the following error occurs:

Epoch 1/100
*** tensorflow.python.framework.errors_impl.FailedPreconditionError: 2 root error(s) found.
  (0) FAILED_PRECONDITION: Could not find variable conv1/gdn_0/reparam_gamma. This could mean that the variable has been deleted. In TF1, it can
also mean the variable is uninitialized. Debug info: container=localhost, status error message=Resource localhost/conv1/gdn_0/reparam_gamma/N10tensorflow3VarE does not exist.
         [[{{node conv1/gdn_0/gamma/lower_bound_1/ReadVariableOp}}]]
         [[loss/mul/_97]]
  (1) FAILED_PRECONDITION: Could not find variable conv1/gdn_0/reparam_gamma. This could mean that the variable has been deleted. In TF1, it can
also mean the variable is uninitialized. Debug info: container=localhost, status error message=Resource localhost/conv1/gdn_0/reparam_gamma/N10tensorflow3VarE does not exist.
         [[{{node conv1/gdn_0/gamma/lower_bound_1/ReadVariableOp}}]]
0 successful operations.
0 derived errors ignored.

This is my model:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Conv2D
from tensorflow.keras.layers import LeakyReLU
from tensorflow.keras.layers import Input
from tensorflow_compression import GDN

class my_model:
    def __init__(self, input_shape, filter_num, alpha=0.2, training=True):
        self.input_shape = input_shape
        self.filter_num = filter_num
        self.alpha = alpha
        self.training = training

    def create_model(self):
        inputs = Input(shape=self.input_shape, name="input_1")
        # Passing a GDN layer instance as the Conv2D activation is what
        # triggers the error below.
        conv1 = Conv2D(filters=self.filter_num, kernel_size=3, padding="same",
                       name="conv1", activation=GDN(name="gdn_0"))
        x = conv1(inputs)
        # layer_gdn = GDN(name="gdn_0")
        # x = layer_gdn(x)
        output = LeakyReLU(alpha=self.alpha, name="hidden_activation")(x)
        model = Model(inputs=inputs, outputs=output)
        if self.training:
            opt = keras.optimizers.Adam()
            model.compile(
                optimizer=opt,
                loss=keras.losses.MeanSquaredError(),
                experimental_run_tf_function=False,
            )
        return model

Also, if I remove activation=GDN(name="gdn_0") or replace it with another activation, training proceeds normally, so this may be a bug. What should I do to solve it?

Thanks.

System: